[tar archive: var/home/core/zuul-output/ — directories var/home/core/zuul-output/ and var/home/core/zuul-output/logs/ (owner core), containing var/home/core/zuul-output/logs/kubelet.log.gz, a gzip-compressed kubelet log; the compressed binary payload is not recoverable as text]
/'AUfeq`9{NPCȬD B7$kD$HJ299Ʀ!2|s \jvz$12EaNs'RP B*h<$n},:AAQ= ^'I0c Tpch+z3mND+{.rGdQMSQ.UҚ6P:Im9ZBh 4XʵNMHQAJ xZFXot5#6>?i#㴼<ï&R[~tx.渫عϹR!*1ٞ?^q8֭d[]iH6FVĨR8౩u}hEi٢Cu#[zVkEC@xW<@Z(ةFp|ȼgڿ|ȍFk7-Ë%ECAbX!l4Ї>+WF`7 J䱡0`}6B2%ګHeYi w&0l1\b򨮈E;Z%8W*bdf`I#&eVCol {ZF"=g}Ӈ?xzJ`<=j;ю'@jʼnS_E<_u/>?Һ迂*+ɉ߯S9) .y]2JGeij%K%p;ELD,~:] BAI'ܿhؙڛRUz[㿝B,R4\_?ib660G\hpqF|w5s\ȣlWaԽ+,SL1!|x Q ز3xz裙%P}mD4?jQ*ˌ0+{kg*qP3SǹYt9x8$9 9$\J9Ji④kɣ(8 ,S+V$Fo%y_5ٻ"½M #-4 7|zUluk~:΋-͔ JH\&tNEs!~9!:|ɿ4|*yg); ~ƅ`H>W9B @;di.9(\wn/OjuK .y'9%\X`T𠬒~+܊ɯ]d !SMbnDRSɧ=~<ѳ', VI*!eD 1(@TZ{*DybB箋oH(eZ^S̢*}نTO1{7,mh' 4\V#'`UbU8(3ĵ@-т@ ܘhv֎X#`6<iW@îʬ 7N>~2-` wʷq"G:<+A1r}ʌZS멦Gkӹ ΩJ!$<hcrJwDf!+*(+aӎ5 f-}竄AO)F؄j45 *=QLpRrhϝ1&5HQǒPy*g+hO?ҙ×NӮ1|jP;hc{xLr!?@0 -Bx$ 0 1.cB.s>l.+5#yL mޙ D>cxEA.(+CPXJz_ aGQ-r^Nu*(<t`Ǝh3Q=$Γ^1L۩6dR 8ZSdB"K*RiH,:%΂11 j~n; Cwy~&(-O!xEFl \(I4h" @#&hr%9E)j#xHi` M̟7Cb-!vU{X% Zg? *]->נj.GRWU㪪BWkR?[`t\e +knH/Pu=vw/U(Hy$jY )w@nQUfeeee"W@pCb?#TyR:Rrx-{Q9S=%hM4J0[L|x֖l hGwՎ1󗏁l8\yxdKw._=UPKqo"O0-yh2 pVXxpFh΋Sg17)oWv5sRlb[O4sAF(Wi9U~Xbi hYiE iG fYT~ |<_WbO5RIj[-~phj/ea9i3򛑉ls0=??[a``)֋hv7w,j醉#9SGXWnD6 yN BJ"b} ?ROq)^TW b務=4h0c-4Lk~Vh]`ҿMϥp] X |ƎOAG{2FQ:m[b0,\9 =g?Y*|r@62.gy}TӁJ|hů/1;MmkT|4tmjI42PF YPF$"Y^I m~Ql EiECAvГTR؆ZQ]H!HYr0d!I0A1NH %q T c5鮇NYlچXK3*˅ ݗQo}?C<U>a-֥ײ] Y w;4wˆ }Q'Cv!?6kŒN0}] uA6ӫq%^';nӘgrlж\sm'{a%44N#x i4 E0_9;RH{]4H-bIogH7=N ^ұa4hxqFOs|Ɯi8$eB5V޷D"v UupG-՛(#':g:P9i6#')N"(1 y:-j0 <ԛQ=}V6Wy3_ D5>J/*nQjM[L#~`2͗:,hrc6Uumbn}c9ovyIQxAPt܆^vgF E6w•Xoe[:ٖ(Ff::4i2&RW<8hCřV)) нlK V朢7&O ;4" X(ʳ(J҈05I5ՔДDh08881cL 2#qLR B!SF )QBUAIưJbI yjB9!&IP`BJé~`!ƥ{m]He:pŜSԀ;-0ՑAΩ~( Ij~j#CHBw3ic!zlsypһk\ i,$ _jYtMJ׼,,]4;//] AUG\'U\ˎ(wM²®I/VZ풍tfL˸û :D} %Flߦxcc%Ҥet.p&m^vlg0!%uQ|=E@ BKd$I'VMљS_Ɗ**6]* Z.E%Q<`9m (¬rI'u(TWI*2Vc( VS߰72WP=$U5p}{d5{}UT3TV>$7X ۳}[&'U[TV EpzO`}yg)30ШOݫ" -#͈> Yv&%Y~KNnVyK@luyM;pzi@Ud?kMB6vpHdDN184.o&hf&[)^3?5 U<]{y4t3wJ]<.f46C4DO",}x IJ-CTף>ÿaL&s0) SyUf56pg0Ǿ{L#[=r- aYw]>  `ZI&{D\mo 1AYwD9Gq Sy&P~u@=VWhQPzZ:y531t}N;]ud6EH`XΨ*%l3HJcS."G41"rXџ˃ 'Am>o1۳؁40h0#c) 2̟aZ{&?mPov:vhC]Wzm+O3%KٖELP L",3B-PdJS!#ŌI"ȓ U# vjx;kw!/'^0͂)>͒N 8fJU7ozqպ7ͥ.&|&Vή>}IkQ. p!#% _2Enj[䳅M5 I:25Wn5q#d2ˀBxj$LRIP3MMb#,; Rl9J0޹(T:Yn*Iv^|p(ܭ&Wm*v"v*? .@b&Ҵ8-aEJٮ4Y.֝"p]PL2Ҷu[z@- wr];7b,{_j L'f 2Emڄƛ4{0vd֖a7A!%蜉S :1sW!Z>GECM"*4qӂj^-T15 xB5WXZ}ł, Wi>sBGy\viX8)Yze WT-QSzw2`7F.ez B)c- GD8Sgd~d \Ƌgo͜+>IG/A W[>:xTJ1B 7۟HS#An7R1&2~J>u{_H{ޚƗ*{o Ԋ78>欐LM}?gzgdZ>؍T=1;=h2NjgϢW]7'~;s@{tvKg{?P 8Y{2GDp4/is)H\N|4L9fb0# `(%^x48 ןSi6VNBuGy1rѬ/E@JT = K(fs6,EIE\J3~WH_&.>G\Y;@+Irq3Z辿u^mH_2zeps^_G>2΁Z~ wvpOc=463 %V HmA-ptqǫS + p'XOo,ߑ T[^b^~;vIB\?=ͳo4.vjs?x4L8}zފo<~ +? ]PG6upYߕV?l}*PI>H."DK 7Q3 o BD ca]x,:hbNC6&˖0_K BԙM5)z c/NE%?,wUߴ*2.4ņb'I%_1{H\p ,%iN(ן\/5LzCuh-|~!fqtɫ6L_bbڴ܂5GRCmmVλX9]|v%%X}罿-6V3>0ۢ-Q%1^j$mzu|L*b￴?k}sZy2/ƈ~L#o"v|us[.n1hRT:ӱ a|=dQ#&$إ6hkڜpݖC>4Y՘㨾\)I0{iFg_b 1Rه. '-2ǿn\f/GN罉hgz4O-7%8AFFeus’b$~#y8IǖG`zɑSR.w7}F '^{H @Z7#CflĽ%PJJGK9+@XDzJlPqB1 MՐ\73Grnkzm9M#Kp Q_KΡ٨b uAwُH1dvnv13#)6BPH 9l @hDKfKؖVYTMFBϯׯׯׯ7_ iZM< t(c~JA[2O[("kFdDB};Uho;+@B;# yП]4i="& _[KZ vdiW}%LgEV"7^$I_ќ *pų w@N9w=͋uEt|HR&.฀BxQb1:=iѭSJa?k|"e^fd2Sʾq cFqʇtqَӄº}PL$b~Y#YowkecM/t6A+Ze It7[)ixNL>u#̴}"}ʺ7 _JU}dɆ͟N@tDC2l )ӿ[4Lyi@'^Vr/0¾C퍓Ʀx[9k K|i/WA"ZFɍؿYly[K˱A`_w%%^ qf4ٛ"grMCN6ezk/YiZzg>mɈqOAkwGqe\bck$ϼ͇됻i'|rه-5r#[[O87꬏zϧoLC|ؚc'zrz>G6vWCîHO߀܈wv[L-Df2H-bוڔkS UBNsk%>rBU]N UrBmLE0c!(JTǰܓ0|N&>Ob{rE~j`F8SzKXK:'8r bqӾٚ7ҒcCNưc|, ,y<`P1%DY!'LKKͥ9_.7uс -? aji?Izxtw KH[9SKP?>}nۼ䧋"]ݜJX^kNsϟ>c{dNSQ-9?߹ww>?kZMT(2>:N.N:}lM:P?6u_uHaω;Hx꼎A58PFb^&ʠUPL E Mt oGhkiYCZ5WYnJ1}[5ZO je|?^𔕅}*TxUw jcwM. VKs!7݆xy >?{ R/Ќ"Y=<#1. 
dd`@EHRgim#q-_s3:=Y씗elOkO項; \pv !E%9Dҁ Su_p.jБ=I^Ig\HPL(ԼܠM U~W$;fOο|Z4PJ0NJCY EVS5f$qʅ(8Lրtw[Hl%3 9'eZsIJeI6G 4 6]Ҫ$+ XolNtjTe6ɰ;IǤ&%dsجla:ǃ`ޘOXBO8B~ =OrQvQ?ơ~IEoPuVgl7ɍ#*Zb)^6BRxGk3?޵q$Bm!՗"Y,llveIdS=EbCa#LW5u ؁2!R 9;W&IGؔ.X"_޶Tl%$كvEDHɤ;E^;K-bMnYFJV'mXCqT9c1C ГoXdNG(M$SƠґ9PYcbECRrAOnvX’釟./V6r(f$AќL`PdhmI^r V_6'eBQ!fD*6QrAQ Ϥ82ioQiO$dv1N '|_Ä<8z?[0VR@b١ %C'5:\ PyIC}kٗ.Y-2.]sO /|D1X{^c)  !<)m}%9r2uP0V+I6Zh>]}{AIH7N`wa'aUڷpCS+fz)n.0| =/'lR|(3H%#t"Śm@iɞЪIkR`(d]ٲ+)ZY)O=.SN,RXهQ2&eA/L=#O<ns^ d-O˹P ePj<4,a&GjSXd/)ggG(:wkL<=##O#WN3 &O17i@e ZqCzg1i|g!KJ=eY %t1sR,ryZk;+^s'WF[s>C٫Qpp3ZɛJFӔoKìu> V/g%RY^nj9rR,JzȞcMsJn'wc"C0j‘gk<'t{Isli"WrD*^ڥ_ݑ~J_4gÍ> O#zEEr, Rq xyTRG# b~ץ0_& [$xrtTI%/57$H^Gd(,fe9.-*dIGtya φKVgE}|}vӇe|O\9gzY胙7Q;*$O[A {'|D/zAbݶXW^ڦD8 =GEu(ER .D@JS˒2!ʔ(d=^n1+^-nwBӏcWHAyq0?\}8f 52﷚>]QϢQ#7?h7^r7NYn囦>@۝+;&v2ڂ^nUT 'qА  kwxw"p` MftjI,іl(VYwoZs!4=fZu$HW3.de֑0}T~g_fwE]ӕȅ>0.j'ٸҽq /H>%{g\/0M.eO2m鲽3n/<-Z#~u ժ,u.˩q;#Oh<&FGs}_ejme]]{=nqA9[:ۛVN >cFyYЫrxn.ng ({˞L4{oϏpnqMyw9;j1{G|&=:HB"Cj= dF/fxRرZh>#w$3_wn'9v>[*Ƈnn|5_6`[KŞ_M *2d{R1ݹ J:]77:hEo.e'W!eo8t4( v&> //zn3(8*yAargҾBE̼PQm7J-OMMwoǏ/Hі}?fLI2 q3|l]b9GRB̊TVndžXYN_&,NIqA Q*s0؁?YR Õߎ!f(ŏ߂c'RUB}f}z؎BisC6F}rډC{znLtUj\8K& 1C9LĈS8݁Pݦ-RVgcK()JՑlҌõ(^!uGq 90F!}&.-Әu2-$ hFgFjܦ#J=qwثo;kmN36bӥiCZ,z;L0z { j9+k (ƀ<u/ 5SYЮ˂:Bo5uA[0h%iniB5M+:Ϭib[enA%.P"Sd etOirbH v;rkhGԼ7jeq69xf尣FKW8%)ƶ慒dDN&tD):ä^H˙ۈނg^5P/Dٮ-3+=3222r) lRHL vY-Up"y/@V#4/3hJ6f9Hg,3=#!qbQHI2]tw(u+xrC~6T"w܏Fz Շ?1 z85W;i%I7ow=rD#iSߑ+? L!3Hcc]Z")3)c.x,)K.aI]kbib LNd I9fIIb%8a[ZpyȾ]Jz Ueo|i|HWv2N N9D1;@pW>!f5obvhӇ7 +d6).:`3rrxY~bx-uT4}^s%ueoZ7ĴƼ]  ؙ{]ͨqQ{ V=fvٹLQ5fۨr0{7; ӕCdd[{g'oubd>zǸːΠٙ ˌ ;( ^tOAqbm9(|^ gp0\ܘ/Y;}8;aOx jbĕT*6l>ag%<,46d=h.b)(MG%7." Fr_El[ti;T.S%e$Kv׋&<4'CT3NפL:'$rP@Xlѧ8=O컆kZ_aZ̤g^CCjU13hw$!p^S){I'@&mMTUI"RZM;s~v6eMȰn6TH-+ $s.%Hޣ:{(μVʤ2k4)gTՒV]Ae)xk9RβTF':Δ^RRY[ZVR-5ӭv$ݰc2Erd*Esb\#̭ ` !+ߒ:ae[5 k rv)f:8 |wٶFd~_guB]"`Iᤈ,i^ N<9:Hl Vk EJ"$sJ ۥְkZvB}zM{MBsu [:.ɠ5ذ15l3 [z s [:B[K5l-]!0Ö/anttEjr2y>3KTJͤmF~ [dρhRRpYh B0(  Y 5]-8EKG Pu% N!_^~ i<7̡ wcV%v\Hp|O+xbNާ5EN@[NR&w8VV{IxEz,Zybs$fnH;*/{~N.sJ+3.!J  >.7_v $m/}ܥ,#u;h|8W-)NG0\YzʺF}YEqo|4ƄR\ȧjcbW"zF]͸:yk!s8!s~6 `G<(?Ӳ{ۇ׹^q_r"ܸsurzU+'bw@({?/wkuխs7Inn+#P v^&$aTqFDDJ:-HNV 4&'P+ht+ld4!~nG<ٜү8# gsQmU2As Ec#b* ~|lbQOl'LD:T`PsߣfЊ@FK۪@̀kJv]X.&L < N[1IoQ$&QRpn=OC,JWzNyP%ţ{]qDlp*. 
6=4Նۏ{ׇv׈rDger+HiiaK9z[#{tY.@]; y)QZϪuR/7gNIZtI`1l\#@73ĩJΉPAoB-T F6 L@Bi3Xɴ{89 rVtf29C.Zpk㷯njqmGZ(tTK%DVIP`Fib#,)\I?ᕹЇ# ­:@T q`u"P#í/ ``qِ/VyXNP(•f'y`ys!Ч4Zy!|Xħ0͇h>bZ//O&A׏u;kE ;7b;L`D\NI?χ߰;vKi5R_Z4L"{yvnwdT qMvEbyS?lעi*Xc \>/\xa>:ղ0{5T؋| x?F4|7p>ƨ!=K~X]Bz$50fd5 {sʊ`j`xG;kGs1<=ƐCfcHzX6OcxnB>vTE7Kƺې+, qUqBQyywi}^ŧGӻXyR!ȤboyT0]6;0x.a&:j ZؐȐ SnLsΉEY01Q_aUSlۗYQZY^Wo5Z*Z]iruكU옉aS?7=noW{!F_ި{Q@]pmkQC,GYo<˜U?(JG]/5WXzsGONS^0@Jφpp$6;ӣL?~; >gN cÈsGG8-lCw`=PNSÈS0G܁-Q=;:1 Q ~"yd۴= aMME;5C)@rFjj x~8s [=xvɪn!,[!PU@(כ=U;8UasǐPH\!kϝP(p} gdNU@,# ㎡ !.aP0^&Q` Ua,F^ AE9YAG{k Ay;AǯU<a5=:E}H@0Nos*U/n8CyVݩnl忬M)NXl$¾ |n7B/_jwnw%djYDP9GpJ\7p.MCAaXbb8nnh{G/zyK9-| hr=AP^E1B >fQ,Q<dn=݆pdmi/2!o_Fw/ K8E'0Ѳ8"nyIdpZ(ʣ!]?@&|/C%]pcM\bMLe%GA|17JN0Fe0:l+el _jғg7nP0CK~`)Q)\a^9灧8<ЍSH u YjYRG/A6j    ƂiDY#,A.qC%A(>4%"oț(ϫZ_ч_?9͵xn_Jdc#dYN(yf3{"7I?v:Hh9ۋOi};e.s՝Krc_"#dM"dO>I}Hѿ 6W|+k?W%pk|OI*~lԳUmM9ZyoMgKBۭr$e윪Uz<@z]_̜ciFj "gAj0]z^ԊHF$ycsP' JO8uE_V{8 d2>O}p׹?pet~z4j?䶱yw-kmH / ~w㐳{~8B?m2TFIJTc(a]U5L[s^).C;-lo|z|Hﷷ}3h13RCVD9o;9q6dѭZ7cZaݚ:MǨcw *݉vmZjݶh#LZb[C- Ǩ&,Jw[SB}- !SJGުu;ukʃ4u;\Cn -n@ȉCl}/Ia[S@uLiex֭1m9q6T߶]nr8֭)Jt:p}Q/֘R 8Da^nvnMyPcԱn뷞e]ƄZm !S=W!~hal8F-`2dj?߭)žrm)`/\nB"֔E ǨUrXb!M -BN0šռ6Dv]V50ϕYU B M?ꪏ FcW0RUt*51j`U5tU [Aɱ8Fv cg \vnV592%ZX$%rf:mk֏(w8ys(0~鍆q}~; |fMLQ]w_7gf3{=rh)Qz<2>̥Tl)j@ToLW/LWk?"jG7WwF_Wy\\_GEEHs?n@ d5h+#O/)5"sw<Ms(Dȵǃ5/n~4S壿LKFH %}ht. yqEP+J][3R&JF Ĩa>&CKtGO} dӧ-z@CAػU-c%%u8#Y&P;f D4,FB[4nN9i(* ɊD05Zj0ϤZ%G?=m*Z6\vj9sFt`lA1R?E޶s;K[FdT}0 :7T>a&&-=)Vm\0.$|-}--vg"_4ȠՎ[wk,^(Uú`~\ƁO_\r1(9L*(w- ӫIy+!2$>^F7+h{}JA9`׍ 4}uGd{V9+Isc-2 ϏkTQ$?KS]e S]]P#Sco,'wikZ{ca|!w>ٰ ,ΠO:ͅhJogLfBY& /^ۿZۛcn9N}T@&Z@x @nx&'|J"#| Mtq}<Q0> !*~[eaG26zv-:MmSr4u\?Pi;sk5NnILVod#72ǻOK9 /Jcw}3ߓʸK_wN~ (/-spg~ZcjfhxJ yKuB^k@)SxƄ !#3r[@B?{bQ8(!æ3o{̕q\.5HCl$']֤ī;*[뤳*um.6ZQڴ/RBZw5ܙf(u:>_h+`%"6DmnkY-nYĥzP-7ުȦb~Q"e@v<˭g;N[yl]#{7/Y $e>S-4P$\ܜA PtDi&hF]o^&Jx12VoIKKNXi.xK= {BuNMDD[w8-{$>&%N3YJr+ʝ^_E/g?g/> qir}z"A8jvov|q5p/r'\,^&OsJ\Q'A3>ZQ'JS&'i$9F+cF)%M~N5X(YȮlc'S/B"g9x4#&J2im¥ GB)Q{/$Jl2F,DƼ^RfUjB X iAE#HSJ͜H5'l-yCgN-` C)G@ }P = :ސF? ҚHW׍$hF.9M!iʥ'" y! 9S:~j!Ab>y#@ h "]>|Kˉ.d>BGo5Uoz{k:ƒuHDr3)ahl\yF0ɖc 2*uh?\i@ ͔nd-f^#ig=|վ}7˗&ejs?. ih* ?bZ$2" Isʩ-?RjlV2%Βu۔r]6hɡSH. 
*IgTo '(rBP}$%"xN]B zιiƤMIA(A?yT+$̰ VD?'mng2_zi&Ho-f͢Ljݯ`iU|>z{<2ϸ3.x0 KKRFȸ`:& y1 4|.ht3Sn&䏗yz0 !k:޻Y`f[qϲf6CAWeC瞔ОZ5ҭm22'.s22'.ɸ*"(TkR^8F<$tk c)9<7qP+t1M؂hE[m9J, B.$% +< .CdDRdBʱ\ZCʕ)HCIIM,f܊eS#f򉭚كmg|y{]3$E e/|\}Ya`|3 ?C_kmH/w{Cu-l$OK E*$ ~դ(f3Cڦ$DT]$S!q\R#>%*EX/E Z H,8 F5b0"~F&z6 aC S[MDP 5 c$ҹJ‹˰T= zyX7m-0餒Q,e8mdGj,Q;ڕ)b2ޔTƹKRq97G '64vYDk(O[Ss]D$i"*#*l$$ toqbCa,gXMqƘv59GѣQ@p9ƶJF(P%(!l`*s`k\cTMlHeizLͱឭ@m0P٭&K z%|#i"ϙx̷05Qf|( pwPJ]`@zw`+f@iwsBvСJh* &a37gFa iPei0iq9&BAXK.lja\ELHQUZʤFTy5]9~`Kj5RgJO-oojɵv~u+yk|Ip4<7w7/..+vCBN't4<<4j*7 )YYLЪ[\6G7k,{dωq0-(W {jsX+ӣIs9~o&J|e*RYύʩwǐ{{y#b?#[}2:yM5zKs K۫8)`Dׯ,GQNoXr#ӼE%hÒ?ۥ}coV3I~Q]3*m)ϑ0Aߧy IZwm+s|6[o2=%P2U7^bXX1)nz`bQDŗk#Kքfy,U*(K/h kX^> 55M) piz߮͛#&Ϯ1_[BW4&_]r@3M+],h¿f_G?#r[Y[{4|?..z?Z?].0Ҍ"%[070A2JRbƫ=gŒ44yW~QVNZvssi1MZTʁt<X;?$䭍G:~AwJtՋV0NhLm~mp_8Q榹*%xb2TɊXCx T{ޱSZԚԺPqxmADqZ,/ s;'AujPBZi֏HN!8$2>N_VlH&*:_-%ZX?$2q)g"R~hAC6`H~K͉\ZnA*%\#X(&S$ DSvZbo: z3qΠ >1i>kZtR@7$e^0P)*\E- Aۘ~d7/}Q~ C+AH?z8Gqz1F1$hqa.OA,5^@ZQ p%/*ҏҗ AAOEO[Uۍ )-;|!LP~e  w\YIdX*w߁Y0K AE+YKEMX OlRQ4--$UC$(O( 0ibF'}3 &obQ/0Jː|̬hhC?*|^I) 4F,1_m6h||_F7K)$sTT7*}p 聩T"#xރHD LpN!xyȠXX kڦjcf*yg$z_`>N0kXe6g @ƚEf#e>Dzz,L1:Nfo9Q:(wWs.׽< ҰV',(=nHFQ/CO]SAj{klw=@TR +{jVizW?$]~ΜQ?to:nՏ Q?Nة~|F{[cJsHa (a{8 m.GtWѯ d4WsF^}kK {r7Мtd W,Y'#56Q '‰$I IE+Isypj0cc~'s aTZ.} N=>`[bu2yC`+JQrq9*M"U{ޱ(aфZőa>@fE%YU7'n ׮]9^u,#O,zIQmF,8c3* :EԻKmJbZ}ԧsJc?9h=% s=(g#dOr>eKҐ$yJ"wd9h|%cVT WyI@h!Vl(؃ Y+F'> WV\$7yH{qz8 rEn$xJ$Tϓ`ՌWGQ֜")u$ TLӺZ<嵇YߖHOCW%'?[$A%zIH)t q n)𩴓5rif[4;, s%5(xXM_&>&>&>&>x6" [^2'A1FP5(\8(⚀{h|T@鋊 (}u^K7:DW~XVC&λ՝*|5h!&Ê.EJdҺe!p¬8?_-B|CJ twWftTۣﮞ\Z 4|Ho uVN,e'c9(STA$qB'`($ RDS`ȲR}+MS[#_b˔M-N]O| ZXovЁ. y p>z e \\4v:c65`AO+<:k<R>ФUA$EDC v(B !FX&8>2 JXRjgA$YJ%EQ6˛`<cl!zD{(K|h= Z~;:Gi&ŀmg-a2gh^wTJ9<Qи*(MN%^m=}hǿ>ڵ&p^3=+D jqo;s?mp-kIBP3UQ'X*[NY%ƘϣNemXB(}_u @ :/c-gXΈG,qqqq]#L Kל&L>g)Di*XH^5 ժl}-QM9I5: n:`P=MZz뼎*䤡RZʸ ʃZV7ʆ(lcl!lK7<ݎ@Q:f'0VlC dV(Ah@}ձOחʠ?I('H}Ő1=ju8PA2=`vя~^">! -|n_I}cpfhitt}7p5]pu߮¿?Y&$ÇRxPH͖7G($!DBO.d<'Th$DWs&eRr~d.'e{Z.S\c $Nb/d:}⪋u%| ,,ryq]xwvq?}gCr{\&ufUA'뚔AK^98.9tHqD-U4$.IUd2yEf2>Ie2TsA& 7wd Fx=Q&_ ,'ONlEߜ߳U. |69++pD~Շ5zI!ZܡDp^Ŀ%;W/w%W%;ײK[t?>]x~ wEP|!/*vx6?|/9=8 +zoV4~9IgoV-R>ܮmFXT؛4]cɔ7m./V[_lQy;ELrvcd+`!^yY^BC%})!wxCBT=4 Do+|r le 8*c՞w{ZkDoޝg 谤;ɭ3),@̘cOt/ĈHSjNQR&t6}ٻȍ#W|uxѬ`S \./~99KL,YDy{ Z P.]Rnӛ"7w-]q=Z.(aT[hmes<9X1)21ZR|V0)p>G$4m xY!pw'hēj_;Œ4) ҦJrST 2 ɀ~{Pr|ހf+#bI+rrNhZk؃y"+$V§ SFO~d?{wܜjrEOezyOw + 3GrT"&]of9^3ݚMp$Zb%!Zoj-N4Nwd3JO= B6ZGRK9NX l1yg|aז|uR& ]JmP]?i5d};:@8AܗmњüȀUWx=ʖJ3FH܂sL5 rQ [XJ*8 [H5(4Ay~6$*6>Z sw>],_˅`%NL\I hz:R#lͅ 4/a@:bAg؇Y rߠ >w}V-X@^[EWr%[.1َ'^m/>\ۋ՝h (ry[ c38ߝޙԞ޿ԣd޿{}

QJ__ ,>8(ߜQs1d)29AXƑԀׇzЁ&ooKmoȀq?ޑ|&SUP'@B֑8የLudmPǁ:Mi g "-Fudx:#Ciee,}\*EQpӫ|eX!U+ uY%z }ߣOѺU@%lD) MEl"kP"Eb1(].AZ(Z"M+6kկ-XW<ًϥVw (>qK.?9`"]}CHcNGB^wȃǃsUԕa7*b\d{[P<\3+5hwi%1 t)~2žZ׌l%x+Jh@C&t<-#!(:-CaVM‰Ru h b ux@N!T;3R.H-Y/Kɟ 6z \ p_Y͖li3Z;qa}B!ιp:- H"9'qY 8F! ؀\ T1Xr܊ }'\h'Ӊ@%T m Rh+H5-TQ2Bj)ixvL57Yr.e[~ T&9Ӱ~:%!\f~aX |"R)'DbASrZQ&P?LF;gj(!rm_֪~ }jjBD2.u"[H &>Rڌ\FHOI,nDVX5;܉땙G=0TmrBl&RC fH\eQbӲn:_җpVM@#etΚWѸ`.))Zm29L25;Xpүٌ/BؔxDZ^}+pO7Zŀk|T ]Tڢ:7$,yL%N?}ǿTwX_5Y& 8/Ūd2iPܯe%['~GLk^-lU6MPvZ6_)uC^!M3l3Ԋ욚C؏jG7=A>l`=׆1!QnXig160#Rj.!WJM)1puw!T#;+D'rdNRQrC\0j.cAƎwCCPzـcV4oG!z܇HSfpF`aѯaZR4#ڃ%C[@K&˳L"]YVi"S M)N T*wm=`v{_P刽D7w _"5mc PYT~}~XKN_'c6zj}i/Q̵ m;^1jeۈl)JU k5u^ F:pAfXl4I8h[h< kUDzD_A`txX18 º1Y$ou;c@ob>F*@bCu8 63;p =1Wy:&ƿ߀Efr=XÛɹ[. $룳yX?YVPk{Ae%ErVrtWJc})Lw~[-p$YYV^QTIZ'v/gab eABOk;nj:Rv-BT\5 ?>)FQ|b ?L5c^bvb xILƐ@/`:t% -;3t5>ȚJwۑR5RgޗqаꘄHʴO` U'јvBc=h$gSDAnB:(?:;|L؆g\B'%}+C7ʔI!NѰ&@CxZ? Vň ,BL]mu]^ksJyX'W],oЁ^%_2H o??}3z_W3Y\N3mjZ"gx~3ЃOZu;""s+0z 8#x>BJGC ldKv669x& H^3_Pa]^֝a$ql3X-m,7#zUWQVVIY6!6dfwsD٭eAm6RzM6~+!PC LSJn/&j^Y&W,zI3a)Ob7sc!!kݟ:%^I+DaNZg+%kցZ!FZcG]aإ~P4XuΛPCW(gT @UJy֚X hE*_ zk"M6d\ouNl9m`Puugfm%7+Kt_yȕ-&>j漼CE'^ORAvⰏ>lqEkq bejl裐MW-@݃j8^zV!vy@X;i cG1) 4.Tg.$Eym*^3ܟ zA|9ڱ,#)7`h-7—CEr P׋=v` UuAq;aUgVgm9+W]m;^xx;\ IP^(?-y(?ȮLkƂܵ[r v~&~0KE;-e;Q)4ȿu Vi6DCfʘm0J)A)Nz%H\D>iR \ur7ӵxYdvIF3rN8T(!6̥٥qCCl H̵0B$PrЍ Dj2)Ӌ͖3*oq`ib߃DʷԔ1  >⥊Fy5DD1yI&Ht Y7~M®swc5 hl:D1cdN{_hZQScFKcuٔ1f{pQt?3afʬҔi>K R"jÄ2_Gp Xlj֩qd184f"ӵOe?ބ~̩P@S Pe2J6'.*댞U] Pu/9^@ o1$(ٰLFC"9zS c3Fg%m`.$FkTrb`?ȆlBHEDNB L4^NB 560MK ~#`! c4:"u +yS]H c) >cZ`)E/=CId3V!лs\%\+q9E` /[F0$h үe@$joA۲?ēo?9-Ո[UoD]1!jW~mB=`0?i荨~ '$$Ox*2D'TՃJj &޼A61&tfs]Ϗ3f~?1VX k+~USU_~Ěnz}5Bżf+m9[/c2u4I+|i\[i^㇖GewuzI=`0u>`hR9SAsRb* :ʤ<˾2#7Jrg$˗F t՘i>{鶷SToN}(OGyR=ʓQ=ʶg'*&M٢G'%& ("Qr>rH2bD%=zhOg[E+Iު/_I0z`G=MD&V52LAi g  ߤu7 GL1ZAa7gWR*nۅńRcu=N/5$^x2k;API+ӵÐ_RKЫ WU(H]56Xߙa/م#+PE}3>[nWSݮ]M[QyBNۑv%JxLfjQ$҉@Jd H0|PbWpHT[N̗j*5F]fwG5Vz}׵EFd.3Xb#%UM5JX+3$0 D y((!,92=Pd*`hBҼܥ%lVf7ӭ}5?2 ކR'fai*Td޵B_K5cd%ƃkHnF.:ûϴٓ)bcNY$)P*T֢@J]RDWJ[TE³^a!.k5ڟmV""!:6c`牷VsbVʁ%DaQ! .^e'(uTs$NiVt=%e wr ulFU)7+m/xV*,edF+8A`k_/boTa/,EJF @~MAc.q_{0Ir':◓饏oB WfyۧGlI4HjLI(>V~GK+^F [?i-H ~bg?ѨL.'8;r èV/{։Bdt4:WӿL&W>;53Ԭr~Kxy1$ SHkF $I<-i I`xZvF(66|96 6J~"4CfPZ'̅` VԢЫeylYX6]@,H5jyw3&#Y cQqh9̓ЊXI/! R!TֲD<䊥limm Nf. Ț8M9Q&$3da6IW ُ[C2fp"h)ydvYvBTspFPCbV0H;oLNՈ+P[>V`#v|m@%:Sв+@/U:ִI12ZSW(y z =" ГF8WHYEadNYaDvT9(tXt_M0O_'rGyc[l{wO~W+m^_zz|??ltt|qO?F:YG?܃bx?[Y=oj+уngC>eRߒ/&S SXQ΂*F^GrAgОR;'1O/NC&/͑HW竏qOjʚv8nX~ 75Ms`唕$_tUt MwL0dFv JhCдzBrα2sŵ߇\O=kz. 53评Ej{ߺ؉Q ekg꾄,"Y3= +,;^џ$/u9 +iIru52:ܶ}ήn}M]KY; ]t D dɉ"ȩB"7?MӑflH!ffJdgU5-כ?8rQe;Q~MLa%Q󉴋L8*5;(!:Qdh@H6\2蔬Q2"aV[IzDbi Oر4Z,ۖ4؋W kHC7G:Q& x3LVdtP(pz(3WQ׌e$ ;SE8@ZУgf`u̶Pd5l8xg@t΢Z>„csNp XOL1sʌPP;W-6f7fת%m{~szςCsͯv>qg WcQ6 |9ֺ-)Я9YGJoҗ[j@4DF3dʍ.F7. j0iYdpYp"2]rcڠ, ۉQ"tvK:1g^}"v^Nm %Y;K_`qFr 7*ޕ:daERo'2mI+Ǡ$C̲TVf"YKzf{C2AISEHt~7we͍H0cRHP65/뉎:[S$Gj&(RT((Ev-e%D G,e,1ޅGL%@qVt< ǬB{cݩ1o4°#z1lyޟΓ/&fi|oG͍wzq=ԅ%b80Qʵ*FB'Nj'5&ghl➛dAR.VaVTsVb,NH;HM㔊HZ&3?ΒhM ed-g5vea2vއh`z;|? +}[zI+R[UB_8x-*2MS{l7,N֡_dg<}|Ydip﵁L:fou4o Q*qkZΨwR8A`jF]`r~VjzX-"=l3?K8lEZ`t .AMr{'mǪBYIvk]^ ӒowRS.)-ߗlkl!b8բΧGg|z9 K|shKߧ˿uHMW8hk8?_8sN}f@4Yd%bٙAss+nshsl)[%7(=+D3yvg1h=<=+}I9*ׇhiWQ/"3u }ukoe8U1_3ҺU!o\E}t*_I+릀vb拁>źPVufiݪА7{:%-uJ¢Wy?; _OҦY c0 C(=֖mlLfOv_iRiз}gjdEP3.,BgPΊ3&HN젧5lfknf'CžRr'g5ߌG7[ l]~ QTjIUUl=|VSц]Q7fcZv-3FޠeK\sEBʒ5DT^wCˢ"?y#X\-8u;S(sIX0@!)P!*q$@$\qL98u'8P!c!VН OA0!I`)MpqBp uBRgE"|Eh(Ww1A(X ! 
Q$B$ q5IB "%VWrF "`j`VI݄gѺVՏ\]G (S .œ8+(VD b "3dB)bbӐQ"q¸Nj(87U֪U\WTªVZ|BWUdVz@&Q&wyĒVҸRSa*5"1sb[*1K_?r7azc SQfdPoR9?RM1go`P`^ |]%Buo<!RLsmK;9A W{v]yW|0Kِ޵A󙺶v~$B/i?P7$UsuÐ-.@.!cnzKqK-j^Zj ؜n!5NW`ү]#G>0c`]*0YZ'崱-t \?qnj_nguqc5=]sigy鹾qQQMOF9Jt5UltOIkWR\\ؤ>%T֥ug;F,А7^:[wu~`0 ՉZXR|S\2\ݵ>@C޸zTYjnCFs-Z7_ lT.֭E׭4XuokiݪА7uJ~^enJ!A_'WCQW#8](@%y䊚 YZA;40IyivM(s(|X:AT2EGaJ@걮)2鬰ߕ(W3>Fs x-[YcgMf!OPrVq YwԥY]`V2Y)Ev@O: B/!tGiɘRo}f]rݐB//5RT:VS%;R 8Wi s̗4P myRqҩ\U$֍)h|1QgXV|4xͿuBC޸zTYWWM/;ѻ}uksAhw?KVq-S\0"Wٮ {ZqPKp J嚇՞}&-Rd]DQ\lykb"qQQ Rq;y6bM)XGH Fg}S7p06ZAP&-9)5![kYӡ Ð,ka6w(C&)1aaB5Tخ2R9i`U`L_zx) Uv.Unp\/r]J8n ;R[K5HI!D*)$.ڋI<)ixG(QuJ*4\AI ƝţE mRfGh!3S|EP$_ Wa^??!a慘lݙDi{~|JO=R#K"]7B1[64kW~$$3481YG|p.$QQ _JBNїQG^RՐ.)6Odth&e:)ec/26fx RIKEBNUAs)GbK8 6v#uMKoed}~[1 받'zPI{{֬~6v6H>q ѼW>t4Y m칞6_>z%?&rފ?~ͧ憹o17zdf(?/"g* x@" (0p PQcEPʿ W' A-'57kϫ6]ٍ7F#b kEyym5zKFh']gp\٠-YC="0jztd:z 7760ѕ>$68Kܙ_H2ݿ_y w.RpU VÝ78?|N{ w.X w.yqnFl΅oe`J.G%1U< ՉvRoU"VI]E2Jp w>" o;>( ȹ%rIkYcnBߍ`e\z! csa/?дYt1`~37Gߘћͳ^aNsKP!!" K~/ޅ>ko^w_f#mE>]v2P&)aQĔT%4Px)HU"q8 [$,ˮFйGb ~bDbJ")p c@ߋ+A4LI$,%""Nea+DOdeQϡq1m,^Z0P2$>cF^}`FdP,;'7iX>F &H~/ V㱖 Q?=Ur4T @L~o,MF»JQw;AT) Im$hYM&VfTny ۿG2x޷`=l}5G( 3, [ Ό`B!\+12kٛ zЯ˳3dײ=KǯMd1 E\ޏz8 ^&jqjl,a܀­QCo{h۽H 侈RjRSf(eD訶DAoIqGOMQ~My_u]M(qrӒs"~Զj eā?9{ 'w::4]&uA3ewܥ̯/;hќe綬\vf"b$ZN2WiSp.[+~e,nH ђHN8n|iU)ph@(?Ꙗ`(Th&Mx|̾ڰk LN1Bx)a Epcmq8Zu8BEB)@W "lX k ֚38 m:EFTLATZD2֖q`kGm v5co#ly)l~:'c% dU3s+;sku_mV:/BnPa`7$A*zW }} EN6ý%Qo/EUԬUE]EaԦ b+) ^7OPK#pr/^>9\QVF/سnP^vx=Z^n{R!s239ՔO|q!X;]K>gBgt4Kqxx]F%\@::pwلGK ɥiKyC7FMuOW5?Nn?3{PM+k b#- 3r#-;ׂR+n]݆ձ (GR8pwZto3_.z)?<ݴc~ ߐgř>v/.H78HK&V T}Q(YlyɷǏGz( lթ.~jEK-يɮrS\;Oz,G>sC*#w£YXN!h+F9yZP/YLԴ.Ei8l&Dcn8a%QQaEjuT}IlFנB!WtZx"ڰRnQ75`;StG7y\g6hw1J1&o1D)e8!$#d>*0f9IqLջ[Ɨ9A=ڤrEfR8FT@)IL\ %: 8;S6[{!TlAfH}"yOB)Nvgъ^"h1 ASȷ^[x)=:=Kеď|چH{igT.cުxȏ)qV:St3"/oƵp;p{3qN=|)$L@Fsc0Qc :((Xow}*e-%r,&1&ɬ  k/Grl~)0IUXQ2" .,tbsji,֛ބi͊԰i: ̔'̗Oc+f0rT+,`v!҇vǞ&L7 7 ctEH0Qɨⶼ[6)xpZCٷo\!Y~ .W;M>.lmhFԬ}&wHRJF2!MI΀si ÿyɉ i(>9p0/qy0ut!t T)QRҐOR=qHq"4d2H:v tsap}<xH絼 pMGG3:ak!t$:QJO#SinK^JȮ,4Q8n &b}frHՖ_p O `{aCڇ< ӀI%kcBz]vK f/#dyzz/|k͟wܟH H.2[}DG|RZB0{Y:𻺠\N;n|!!˧wn[نnV.<"HY,]J9JU%4NcjNbU axδ-Lo'4pnLnFۇ٘ d,4M(6Wcyr> ǿ5.Fۙ9AkArmv.?΂9a B:m͔PNĈi9'AgT,bdhK e!%ʙv;{"bCF';-FOv|M)mj9 r?fOb>7lu MClK E^%h#Nx\4Ⳬ e_y?;}'ʒX.&tO 1pH3u#zON Ts4$=!>04q>i4GAѰ5j@zCn}!tΆAr)xϛR*B[Eo .^eY_HdErV'k_Ϸ!WQ8+,J.*\i]_nNqMF.!^cWU:xU?hM'Bj "z`WIHcT4X%>H4 |o(N]Ң]Rӧv+>PRݬs4N?MfףotLfcµƯ` jx^R QD}khYw ,@)_ύ (׻^y976dFo*Ҕ4BLVMS˸:2fXt<􆋒E eU# =Vn뜭s2碅N Zq՝9wom<|KLRHDRd`kW 5#xr<7 '9甼yU' 3&NXFW)rJU\k=NZsDP%qQ3]ZNKeh.zp38mGℶoHϔcX·x8e1rg˒ H!PRT P@|)8T !U !jk p0PRZ1P( ӊ0E@gO=s=XO%E5Y4g2+#$RF:+%*'%U+.hSc mݒY^1Z5?$ҮA(R\,LdF[q.,V n i=F%eO.V_1$hv/ާi?fg?g׳9j \h!ϯ]b'f4s'.w͞Hǩ!>=,ϐ>rقɆlNJRjxgw޹<[2`kbaנ<#/5J%!}F29g;[^sx|A$Ȝ+vB#%)ѨZтڗ0gġ@ M"r|RA⃮TA\,[ SՋm&L 1hӢs- B% ?Abzw^Y))u(tB?|ɁR(D %*]dAIh3$EL|s6%vE5 tb0Hb` W59 o67U 3( M #w68ͦjkwIZhvZD'PqJ05%gQaPճѱ3mIcE.[Q#IHD4AvHLHCKܒ;dTz1a$;4(覡U{eGB%wjJjr[롚Di$h-vɠ2 h"TjBKNC@s\vG(7%&S%PogD&m`9 }Ȁc 3gq``hQ`0:dkZsR⤳y?owPU{.{z{z<-eOc3ƙer6Ԙʜ IJZ1fجXMt 6dmL%4DzK`jH_t=Mf}|E< 'EwBbd[Y& E %!юʾl[]dpI5JVv\P7mcv2v2YzfXta1R3ko8lŐVOӭɷ᪾ haG/Us0P%uW .(T.7lT! 
d4r a:訥*h\YC ƭ % gu".~F5O;iùM?3=PӐvϤ7w B,{#|5Zp%!Z |?UcGc %{SWg)qsxB!."mGp-}={.[r]([Kwdqr;ähpT>Gm֢=Z l( VX1RJD 'wSj%,:pN3Uk"洦 ̸cb#3( LXdGəI>D@֦ܚt&fCt7s$gD%t<;hy!Ϸj^HRa] RzF=ޚGP;avFٻ6nkK.3kw:֙8m4a!%X."VeG{;O:9b65z,mVnF)`>T+ŧg&P' G!>nE{cVCvUwݖz(/|wVUY k|ffwOoG&}?j6T{d>3IQ?;Iuk䩟@NmclhJ YMcQyYC.9޺->uw+)`޾}?4 HO!ӷoƮ {_sΆUOWJdY w Gɠce1?foC݇J+p4Iβ"x.^]tpI/&sګ:8ڝdx>29O!J8'u^(raK(̏cBL]`Cz{nL)NKnq^ 92v϶q9="12w\ee>Y&ak).- `W-Pg0,25q WN"(n3r_֛a/1LNZJl8;C*d+&0Wr_FJ ·(fQwbec`;u9:R=ka6nnx'cD5[rÌh,-F)NWF*.;jF-9S?e OɕէjDdX)ц.>WUܥJǡzB>yPo.wz,Kc7o.3v~ju簭WW2 N[:$cEy0W/gc=ONʮLPB4'bB6o9W}*lJU-I ѧgL㇝YJ;FUX_UX_UWF $-хE}1@ 8p>:]2jn$F/K(T c~4J1o)C9)$P2b\X)0- <@CbZ @@Y=Q ʓ1- 64g~DNxՊ~6%&̾nn=]^gsk<"jwȷoן=L(ӾUjxQR˂hZ ]v;T6uTujUJ/*O~Nā#Kc4 5*m|{ɘ:,PbSp\ cWІrPo=Xmy97]Y :7)ER@1D0Ta~DAYb-]aH ~ u`1 {+1iܶf]ڥ3{9rRXgv~L erOãNui.FO{ۥMB@n/~hwb9]6;zCv6}LrpcGw].ڿ1aP3 ^h#!nh/!.?08ې_K;{qӞ!Y_k;9BSRBcܝdY anU~:Mg~j>>~P_COel~|x/$d"8I?N-6*ySūlZ͠2| ^Cԉ !*rRy-s ,S^mMݝ ^ooVq״nmQDCzdsg>6RWϼ%2 &.' /lNBMl <6A+WurJ"wHrx݄mkj}HDMVcF옕vU hTkc- \bԴ8en-0Lvi(T,guŠ~}\I=}N`I;OUUJi؊o3eݤ !YIS2<g^N)]g"|DZX sqQ.Jh 8gEc^@;s7E|l`dy/l7l0KY%-FSzgҵfG,D2|[HD"4Ǔ"P3F=%}XP7%NjdB'qQW{99x,~Hɹh\@?a<|. m*(UPȫWU#kF.9vaE(d"((ƏRPŽ8R_T0/OoI{6t~!/&U Yt^T4J~( 0H)E1+Lk e(pwA˱]C{taq1.4S2(W,+ Z'@EV%Ub@E<ܸ4h%9 -%"+}s`,ڿo.: MHp&RB1 l׀j ij:& &`( P;V$9~Yd2DT e}w6opMpy}Q*ٸ~:3VƏT2_o&HK Y I/r3ݛ u=0.3߿|\ܷө}'>yl插 a@~SXnA [Hxݮuh[T?Bbs1@ ޤ ;,ѐG%χֹ2xd_CD9M#̰Ө2.p515)DATL1 |Y+"0QS'DñF&H|${0N0CX_4KztS!L˷{{g,y4eg7gBv҅_yQ!&}EXsTN+V"挰Evv"uay {:Jd_;٠=| ˎk!O2 CEF6"_){ =xl@z]B&}<@qؤ0{EJoXAW<Ǹh ֣S2v_7J> C3h|mg=b_ lPeF} A$2Kw1\i|BxqgOFϧi"_ x/{uN0FRcf< @ަ8Wdg41W"܄Hv+R)@ڡ>Fɗ0yюy6oJ3R7fn(?hhFV'7齱SB,tn;XIqڢ@;H 08v9@J9 VƐc9G,S7Ƽo!By3a++X!@$/fCPacL-t"fh+ 3Qq̫ԁ<ٷ+Q|S;1w >yTD `Fx -Ppt̸>m]le&pW7DE"lG >;]/{}=vny{XAɄ"9d\5_gn/Qfo&BI^Q[y;oZnjG@,qyV*'vLU.WcMxLyheg bhYzGh7в~%Ms"C Rt|TqL((X %稯czo|{5b阸;9ea%g"Ϲ.PpF\jyw2V]b,D;0GnN  ~.|>%EVܗcnghe:[ gd BHY%s^6TǎxpHRq"BzrSGjKv"J4@Xl6Z6Bk(f,X @QBfliS%A?+mĒ(ѡ2*Fb ZVנ oҒdܹ8Y,d f>֔RL;ưc2X@,VM0.4Bh_;rQ?Z6S:1Qc81QPz!uleɐK8ZW0TN3F3&E :B!M0 #%rZIU2!"aǺiLyNTo>ʪڈ2 k~^bzÏyq?*qīīīī*mN"E"9Qo$rVqD$P\@q@(C͇2{ ~{ZgLpdzc#>mf3H@{0-w򓭼$|G:D8 a2>" 0E5g4,&4J6)jރ=A* ؼCqR@ ZN T5s` A646t$L(C\ҞB#f%3ul0=W WrP.Ix7K~:ۤ``s6M?f7S /o†BowOn{y!|-|kf<¼[^'~ Dp$3ХR)t_Р"5T2Rd N(Uo4 + + B gzH_X`+0৶{{ yJlKlYY%VXEɭnذ$*ȸ2"ļ[BR,b|1a(=6X)Z\ 7)QE D!dQ+am4S㦇NW=^Sd=22 Y<~}I,4/KO-!Yϭ wp0:9$&}Jj<  ' aqI+~yêBl ݿ[sWCO4-D|4*h,' qnS$KL$@Sd Z fj4! Omd@nZ->jtCsP=0ddM%4lWkzVKby.j h][6(E'ˬ9e W3v.&%aVPzR(@Zg[fX$ңOKmAQāC!4:Fo7/*r.ZfZC} }H@MfXrZqBs?E%(rL;.@ ,ofA132ōDfZC*3&` 3Fƒ` j< Fa #p!@7`>M;̪  AyBrВ6Є2ă5Tpd(TĵLWi˙Jy5lsƄL1nI0Y$r͘6󷾷=F5C(Ƶb/t]69} 1ȕBdXJJcJQE|:\+rzގo-SY1xr{b8 g\.тw~Q5(l8\iB)# JFo:gqA}^#Bq3OfLM80k(\׶bj?CH\04ˈNh^yѱ?Z3ګ|9^TM⹉Bz^a@+관-mXV:/7l)iHpG{HP ty^R N{&Q" 08 /c[٬7AESDH2)0 "(S˒j#&HZ& >yI;Mø#)eCDsX T9xF!)`i']LVUkXKI.{rgC݆_,~bYbx;|qFɰ;;kO%#;~ `Ȑ~hyh_1 \&퐳&k9 SsqmMqRPFKhWPMbS^͚Ef9䁘I\ڑϠҎ+8ws9ջ/jnƒvam9^DDw8+#J?=9ZϖGN~VױmOWw3P9i?ÿ~rov7MgٛYƣrnh:C̅h,z[O?fñwhw_:Ȃ:Rkk Q%1DA{o-1W܄ \R5'q¶^F.#?o#qkZvG`Ԡ ~ FU#ͳ©5^ƧUZ7jX%uln$A@5)b)lSvA@jE$FgpSW3k sj%|@{0i n|(W4'-l^j\/J'"U'!ɼc~]i{:¿pӿg[aĭǯ}JdvC. \%[cj=Ӳ7ھcqĻ0НLE<5ӧlgq{i?RL=2ajVϤ'={fE(-Sh[/کɒ-|׏%kh-q>KQ7OW=xrsY;{0э^?=j'gWCw6YFVl%H9"bm/-!: ^ϑANO"W;NpjR嚐R&M̙dJbX<'1 Uc2imLDhaޞ'*Az7w當Aɓ.Ukd,0yz_'*ЩËuM_ *]R{+,Rri{jZ Fng,մ2RAF *GNy:iot%^sEdu*)h&JE-x+Bl#Ze+Q?7M/Ft7%'j̛퉇@5?pL TuesM24jz )#]&=P"Mw:"Z@%%bH&% 'yAs{b( #B# l+lz2l]㫅rȶ4վ Sb*GmBN9\kLcCE(7< `m;oaijJAM@Id)9֮6YmnkE+UtE(R%K'`no'+խ_5Vga܀r rqzOY|Zt2]8=9=~{nd|wr#yDè|hV&b4Fpn0%u# ʒ3s7+d4ήI(;K/ =+<@ s_CF|U,b@z9l<ҀZKTg-ȽŽ#$-^$vy9Fv և\zߛ1 bi fqiM9jxa. 
,;ʷH>lW dDcpvthk$]w.wF>UbӃd,Iu@w2_E~v%њ6q_wmjRk Nd+Gic/ՓJDGٹYE Њޯ祏uO=:>Ϡ_c|X ,j)oj]FcI$Bs@N?mHM]5Z{}o-Boss>f{{Tݪ.*Qgh1jW`iS4ِ-xf.+:+h>࿻'8vͮ)wǕo\옋g-~zt|-碯;X]ՑZOhm-h7 9QbN9)AVNU$ìpb5ʊL<RRj,)4Nm9= xns}Rq*8|dSv,T8iT9Ȟc"ZPY(/pIkA'kf!gN`2/xRNhc &j;gEӞ# i-|`|Sc,U}^ɘ8` Q ) }-E[ 6JhIpli yp^VF*Yjؒ#h%A .f$QQjV@8# ˣԛ}ݍU*켢VPǛ>pAMs v{I$@ִW$IV5?xtvnܳ|!WI$S&>PWN#uZ/ew9E"w> F~iAcj+B|K9y$޺ejiN.X]1E肒~e V[ Pח!|ϢK܂ #)EӐ{{ׂ8<Wvw%8.h#?TӨqfЬvjT/ =JNf| # R>T*}^'Ç ${oTϽ۟8 n@j@<[d.,KmN\?e]H.r*5aڂtRUpY FGD2YXbE!r <[X!TY*^}[ŽXz`>1Y^^AH=$Z/z i"v022w6I 9Q(ݨaX0psbeG67UY\c;Xg٫ 9{.׫DƠxa@[Q:_}GƬrvp3|iT_4ϒFYJV>A1.YYRBr땜o]_w4>NZk!eNƣAjY=?e+n:,64]قE޺SFf7ǶSy7//9wN0T|`0R B~+ Q84 ;# {|w_-7qn շD`/@GL~_}}6pٻ޶W^ [UwKr @wS@$hY&iԡDP>06vWUյud$ͧAx*֊'ָU=R& la5qK[|5y33gaN/rooM?S܊;x XP8bEYAUm4u-4#e0֧E2D 8cI #9cs^k+B5ln-Ȼ8|'Ot).e\VqiG[&G#Nxj#9ŵ Ƶk$݈IaMod"nЬܲhd/V-nݳ ڬ1r9G,`秧WŲOV#/՗^34 3s& 4@jݛ5n kf}W%wXw!esDu68([s ixe"]FjqKZFk% '*ie>͈щ K_)j_Ĭ^mv:VJ.Ẃ|XX+;D@\5.)x VniWWRX >  bw\;&$8b_.Oe{p#z2y i9j'P~F>L߬{+JdOGȩYb{55[H38~|e 4e"^럝#o?l;给CBq+]q_l`o~w7ɷ CHOu-2$Dɽ^[ȅ^}{r鐸'GDi'˺xiވa) a+el84v B xrp=ɧHbDyҨJ:S60P~jn# ٠"T.^9('ɇuu5 !5eg+=$KT8U~t˺`$8'+ Ewu dcZd#`zZ΋F5Ûw2}Jj(-Xa)X>Hʽ蔆*"41s;_wu5 Ft+''@ Zv~Q[[>az)#\.HLΘ"8dƳ'ָ7u̕75V5G>V`xxjuN8ui,랍IQqЌ|Ll]pV:GNW[اGx!6x_uC! }&ӕ‹l#IkBSn޻-heշy~>i7`z@ԅ_fNPpS6J<EB9Έ0Pm0d`70≍)@W<@gu@kOgɰD ǧj"M@#ouՎY4Pa`ޢaZzgWEiFx$M[ j_l=:eL>lЌWh55xf?Nb±9 pt{_"U9FK:0w ''-{b-m߃툁  IHꑒp5(lʢPq>V)ez86q=Gu}ayײ4"MMZۦRmMMΑGJ) RMZضP0fS}U2@pkǑL~@d6 i F u%?e  Eri !$YhBr(,R YU&HJ'*E䕎JW: OSR%EU)t(N9}p[&`Go}0̞.*N %K{Q78%eʬQE 6$ٔQ Xq()S7 )V0FH*%9;=0Қe s^{FZ ߬WO6{ȅDG|-^]d ,h#+~T*U[5@x{-4 Z9-,}*\py#:Z?O:A׆0(-}dO[u#Lsc"\(ر h'۴mIr90Y{.]6~F2E=b?.:4lkqqbXgȥaI94X*/8|kla~y7 n=iFk{EVi0P/afW!M B:6E:UJ'Go֙iG9yb#16{bC8w#5 (#{]yU7Թz,~;g/rgJ[EOѡ xN k!;tJit?ÇڣeUmQ GwL(F0OQp #QjkZ+w(yàhg֌̮E hY?hM4;{.m$& }^ ` SQBPJ0S_u-&›t7Gm7Ѽ-!C-t|b=,iL~wvEK}}1A/GZ ϴyLGnA/'b9j˭9k&0!y)"U#2Z:5tAAq)|"K&1dß[6_N`7$UHfVg*9-,V1i/hbADn3*Ს_褢:Pp{dʛuZЅ{F!-C]K;Ѭ}sjC8c#ְPǶbk{K XmtJV ᙫ4g CEڐE?{'庠\q*%U}!Ad4'r;;Vڄ +eUӻ?Ouv]%KX헰/aMn= =m˞Y愇 9a%NO-$>xUNtS(Ptd]Rz5*d6@&β*${go1c&E4i lx ðuhn=:V{tXѱGp@WI%%}O9 b9I[B&E<֪4)^Uw ! c4@+uԯ5\GyN Bg'XMH_(I'X>޽LjȬvXijU&0xhr(W&t1_yĂ@H{)RXrI~W^OCI%N;zO [0jsdjvc"GIc2 +P1cEJ6;d@Y{w`6޵,b+:ld&\Ȟ&Bkoca"qϣWU]xPWz ^dB 3:jЀ0-!Z.E\B1]q%J3+s$TV{:X@ F%(E .F6^F!G9˄\ܼP6}CJA!`~&6h 1XT$LӠ] .e *{ F<(A2i^ Zl%fh!5[K:;-K,ӲN&il%P%g^I [Դ}r:X@n0Uq= 4JEwMHߔ .3ca&\/L cASx9Fx;7ȲV0),; 58ͺ {mqhqqG 8Bwz^k6:Ø~1џDvݫo1s9kE(0p{^z?wϻz77zO?}hZ~zqtbݷ8 Co^^10W|p?rf ѯ³M\ .?3W_gӻ,ZR$J/MOy.Sä0h0:xқ/j^Z1>6~Lx0O9+QvPWjܴB\Mobxjs>_i^0 .;piyhܲps6ޥvAɗ=wn?f փ^>(^%^`]]V('H.`J:'9:rw|t@C?; @۩ W/LؼssχЃ^~Iñ:o~-ƊK`Dޜ;!ٯYoxR?CU|`Lk8jgxguλEs#W w:_u.ϻ=u׾6 #^mkmGG0;ZЧ5_Cf^HF-ϯ:ːB&WŤ+:2;JDEbB [JKY0"d2QJ s?P6k]GyX ^WޘaC+l ˉǟP|-Q>)mǼۅq)cʿZp^Mn}N_׿!yۭF}q"&kA, KM-FEFy46CZGѦOQUhJre" pr@Qp#@H-l%Qj-nrKːV"2-Y̸$*¥W]`LE5 2,HEy8C>pkpa9*+] ł )e@`2V%H3IJQ$E0Z[+,20#r 5ӌiI Ƃ;mE1bKRs:w܏,~$A܏2]6[miIuJ|⣔O%&<bs+(#0sf/ aU >dDSŪ[VvFa0w~e3X-edqJ \SZncѭkjkPms9V2!QХ )`91=#4x}Fk 3. 
)FU8-Zœ^OYfShZe9 T!3,XXrJE$,I,, YI#U[p((&}AE$@.e EpY?7RIWA9Ŷ-Z4u\p -ʱ`ZncQTjaΕ+X*jtr>ۣ{rA&âdz&KmFFSFXP\1&aQRÊJ8ڬ|(xp?@9٘\]GIŘ6=1PT!W5UG+G 1By|n%Tz˃~ĕR\R%UN^MY\{9X4'䜀fshvLѡ|t,""7Z!)pkzcBXyP(JA~ ,)dmNm;iVwRTy'UIw;u2\dy;3 $4 ZIAys8ieN'ۢC1%mrv}f!{q&HB&L~:k3xx~wjb!׼ixd G1ONKŚ{sccPJ:cB8˄<n]1*ZmB VBjPP}2\ ,gZ3 32.Lˈ38|<:1:7oM^/xt {sd;L,ԿA{fzgb (^[[&2ZoiQ$^8C#K(QAqC±&{zѭplc+[ ǮN(3#6F͔*kwτ!GTf @H&D$H%epanowkiJs}ry=^fB EE؇&[B_؜@1o0 t/|΁>%H)3A[ޠ"B#w>Rl\[Ʋ7=30+q,Eщ }:8yaS">C9Id$1֤ڶ!0D傳Q eF{H!1)O6Ʊ\bwp% Él2O)2qZDF֪(h 7L0PU.mFz[dB=:uQTmOW<8Typ :)D>?38Χ*cJLc2fNGӉ3(5߿gߠ2^$Opٺ{srܤ!]쮻>D-c>o"bN * x$s'DŽ^^18u-jV ~!gZjV+Zա[̰N]̾؏J̆A ] ꚱq3Iat1Xe.CQ~PunO ^K#ܱT6Z4FI#%s(+eEeXaO=qi˵]dB7Fo*|]eمMo*|S ߬L(3m]'sW'>Z2S 9Y%ךDV? #RUuCE8?Qھ"`i'[K0PƢUࣔ q]^V>%w1=i4P A&I) rs.ς9IlA(a J-KMc8އѣ?Tm?>Xm;QkmnFXph4WrMdΔ O[L)m'Noz"$>DJn->p/t}?[?,RΝZ6{ӝ\l"P"PPrEsU(3d|?vp ^x>GzfMEah4}YNW->p7Ӂ:gۜM-OqthJ9y p/}$u(9m/!cQ_/!֢nj&v\]꺣(Vn"_˅/^OooB3ݷ[I#Kݳ-.arp8MHvwGZhVvT+tQ48{5KVG]CgewCt$^q "a>2.t[q<1[sLo/xc.H^z(=j֜qGm!ddKE2@U4VP$H`-SMF5p)Iwgb2_:r;PIny=s{\CV퓵RQgP餪*$[CVڐFdI&,)]m}/\=s~z7U_ W} WSez`5[Y/e+H''DLj;ahB)[f;,[@Aw:_XSnfNf]fl7"@V0!-9sc:l`]r%Kr#wl9\NZT@ AҙJDւ) ld1-&ݒ1zA $Wg{A ^'IѭbNϓWdNE$ tC#8]c=B8|}/ge`nKe wW֒G==ws`’*%qf˲7gE dxjM,Lmu-I9ǔ_js}NI9Ic-2ce>"u_p;RwIb5+Em#X'g$^ض;+Fσ-UFo+ =pkmY-Li69X 75s*W9Iݫ^*W9ʹ=Prw7uܩsJQ9!uR"cd&j tZ#X(enqciV%XCߚ|ћw06|Y>fL,"țCNIe)>* r"eڶ ՆtҐRr⼛KߺNZ,^i4q1eS߶ɚ>ݘ(3ß~2]=٫O:O?̨?~KӋSIﯦ)oO_>_v~dޛR%g0UOϤ .l6zګ؝y$EuY+.ꬋ-^^J|{|^\q%tJ!:pr.K]/?U_ U_>uuVo+I~{)evoX2"D%EF.\(]psݑj^_h2^(?-]p \NDo_vYɆǙq>WGG{b)BO*us@QVLd"MbU^0E_+i|־W~o`os#YvU^9Ś[g {qki+) X[f[LUr*B_groUv*(X!o6ߩk=hֳ D6!~ڽGi.n'2p7e`/ ^2 eEҥ ȹxN-HD]%`>B&$AhAKk \+8e'jGxYWlo?tM<GIWSukeH3ָFuA|iDFпx:3Q%-8YwUVfoL-cvt r*'Cɣ &"G1gLbk&Ƶˡq/l9kɓ[5]1nx9x1E(qx2KRR' *b*0EPQU.,ZMB.Vf5-3g0G#*E1@Yp}l{R>1 $^O&Ș&xǭf` `*<ɡ6s,vF1Jd)k}6 0 Fi9 x6&B76o(oo8\{7>}s0 Y`U>RlTA.ah\rIdQ8lVV/yVHN{EzIpտ{0^f 3V܇0Ioϰ۳WھL3^W?\_-хנzJaKN"5I Dv邤a[2MCrJHniÌy˘TCD6yH?T :,&Y6덴nLԅIM:'ʘ0oW^@6'[z 4t%wLyuTJb΁tRciH(Y`=2)Jl'*e;SvR!DJh^v!=fnj;J+fomzELJV} L6U"%p–߷$G- v54f5J$(-Qa|r.V=q'sTq$̠ EkX.N1KՠTxRG F2zU yP\d}YcZžW֠n54 Լ9le뻑1Gu 4THUP&Z]Ib2PJ692_%SlTF!jAV^ɒuT9QF#&:ޠSl|_BA+OYO"TIcQ6/ \bs P6>ŖV2T6)(ΰrQs̳h9Wr\lu&YʍLPJcǃYnr3pDxsF`mE+``v~VZZEB٥@jUI-DWu'{od`vzޠ[Ω 3$?֔}hLӢ*8%G0 دK+^B;'D e FSEMZ_bao3 Mnqu%f7x=ѥaϩ Ӟ䰥SH*X+\%_4Z>|,I#3|XUUbFuK\DnpNckyT<%wXE[Zq!aHC뙪V%zۓD]ilj0]$[RPw+^l#ꖻ_ 4j7\{v5Dw0ݴkN2XV&Vud9;Ƙ+nY_!~`zfz*0/( IP%hmtul Q mU]a6 %kIg d_zg`UH^Q$ tKB1CV?TLlc8r=.hI,Ιq%#a\XX` z^_^2_њ5)KhznJC'n5JLROfnΒEfXX\jzNՂ1wqH_g pM868 =^FWնLĞnP8zX/mV 114#MTg2_<\6 ^ׯn\k̅IA[]^Vh$(yq6iFrQ8W@9rA gUQZ#@Di"_^`v\ tBdvQZ r(pLb<`j `""7ϻj ]J#-LeQ'V1pj؍HT46#-L4&.FZt{#;,@=mn,ôG8RK?gzݵUv&Fl SY^y\=4:SswB(1EzA5]^̰{đ+" ?bhe^pq $ejQ5@O{.WcLBIi1}wffMsc bhowl!BҚiz# Q$*(\4 z]SG mDv_f%F2OR9ֆqrGp=!frn& $S.7X}YG%̽Mo֋);M7C0.L4V Cy٨Jy _pBbQ BlTT`)k[|'iΕFL2"ُD^'TTr3ya 5:A*YZFN<F$5I}ٹjDU>*D!*L2DINcB e jcNp1Sͮqi_Hҍ >#74J=}Cǘ0ih]m?7|C*4ek鈴l3͋!CրQO[[4Fݑ)ѻۙZv_U% 56O \^a6`TФEFCoESđ`)& HJ;tD[g=X3k3S7X]Rɮq@L̀_ = ޤL%FoGtH+L:˕=mZO75cx{.is~ w(+P*lwJȞ 9Vb"L!1R wmrR/;,%[)%l%]n^ZALmyy#?5nQYU `%%t5ՂCТL dt%q}Yɂ .)(íPZXA󘊪.]UPΖQ|g; ]QonSaM&Sv Y \*)mnTInĵ NPY'cu>@#a^nhd(P|M6JC#L0يPssnʡcQnF1 ŐlsabJDi!\\Ioȕ#JM"?RCt5fccyzҨ17#[۱0/cdĘ>H6:2X͇TW;'M*TBGQ !h<nsGc&^|7G<+ ʮdU14:af]aJLS4F(j6jU՚"UEA 7+/hIO<"͈qd?ԛޝ)C؝5 02S@"]s ׮ TNnEI\f=݊R~074sc=nJk|"Ei;WcP Jr&vX"cV7C$k냥@ј %l }0KKSv}r' [c@ͬa tN`B|85hR)w]6H,:LF`ч2Z :Ar"(B5Ȯ񭸏ZcHXɓClA~zw so1hbߦ%XeoJ&iJQB{QI4Qy1݉Wc.T7fFZF%Op/QDȴm3Q{bw3y^h\qmfyyHC.Zjb篮_ꢰ$F,@׼MCT@o態Zh6)(_;;~VͭNM* iS3&l)&9A˻ IQh,\DRՋp*/B#PRňQQ1JzG93ZC;no/=.yO-ɯy5((1Ng.'0XW-n1T_5 ")Jjnt5#)Kif\OU`9@wIogi80=]](m`qle)XYOi1؁]Nk:o@{wR=S.f`KVoz+&/Li_aH*ҽĽoqy'v3y1hA {q xpCAy'޾Eb1TΘ 
wy=F 8aRsJ{KJӅ&Y@TݟEHj}a#h:CI4H#alWs9i35JbvKo6c|?ngyLwlF[s5M`ٹvOF5F{P3 }eI Osّ΃sJnP1e\7rL\] [ 6#Sd3ߊF` &84jtF1FtE%MŢL*;/螎 rBb۪]( ]kT!1T Q"٢ǀ\Cw= IhgBŹs 3}X+WZu8 xҌjRl²sN{̩F_ |Hv:.2^5hHr MǶS7EoXG..o27_&_ܜw 2USlb^]uF/v8#G~< 9^~AN~&"X>w$\Nɯkaw']s_zV괣ܯ' 1Wdrſ?t]-^_zBo^қ/.Wk!+U.^%\m4㉶"i^"w5#B)I3 .Y0uQUKϋk_^ iܦ-xފGV^ Wp q,wnwo.޶ogk͡&L'j3LT0B}Z3+zjңzڛ[F5Iץ4j";u#ft=Uqrr{' )Eelӟ竓[DeG.o1j{ ݪ6uةY'HjJ(ur0_rFaw|u|s:3z'Ԏ!A.@8T@#pʹ<20Z2.QdKp:-6xp e55U!7u bz[RO(]?R@Cc'M-?7Qlݯ,e6%٘dc^yjc>LBR9/: h"y@]QB@])Պ"/߈:uqiC 䧮_fr\-L+khe}d|Nȣ?ۈbzxqg4a6Lk;2M]p'l o#s\@P-;7i44cΘqInz~9ntJgz&EmE{TָK.j%mYCX<<^\%۞4if7r|dդK8GJ+DY+n㒣ij׊h+SUK*p'-}64 )5ֆ-.G U0)?N*ň¤ЉR sDfD*u}䫲?rͤ_µZO\9ZFmlq@9eVFw#HFN,MGkS Y'o[=dkT#f:}@̒EUjjZ?Ix4nMwtRْ"FL-ӏeA^O*r}.^!oeNBi*Kr2n^ N<E9G#l2 AZnfA1~ [Fê;mc=;OӞ5 viE3U:aʟs ڛ1"%9y*&sX;RtIq Ųŷ)A(e&%qJ p/yLj;ˋX3R )Jw?EX|zpZVaIk$D&# 4d򏺏EwzY^kV,o3ROQg7Sn;~gߑ~JPSǿW|,+=Фp RlO0ѽ5R<]ڇ7=o-xF;By{)wӎhJpGh1czg_9)@M#F:߆CT3*2E|Lu&hV~Ns* K *&0R&h3BnCzFa[}JqOyЖo<#gܰ 3?V癩3Q|Ȕ&ㇶ='u),M&lX,V^TvK IS>sj<=WBO=سq?cuc1|L3+pl^f;QNec5JpaYl[u~nJsLg+YoPrY*udћ?]O^k?T_nM5ӯ?'&ȋl9YT˛ϟon?LZKPv4T4ﲵ}DWP>p׀1id|~z[_w~NwtaDQ@+mJEIKnbs Y0JDE+)La^IQ"w2gpu7yNҔ`޼JRh{)B-ި 2#YNX KQz ?wrre_,h9QhJ/vg>n!ķ^;,y2=I+ ss`AS#9/=#@.Z-g /8B1m} )r\Q1hE9L l/tQQD}dNjD2UkUrγB%H6iIGo. _։%R;':|` GShĈc`q$`CܴpL}<06Gc_GBױFHȐb Hr8i$ 9s7R}ya<ʁ5FJ'2N 2*(F>։FB KXȌ1[@BUYm1LP-VGU$x%E~;IeSM8 DZvu^yÆQx^ ":+-"\Z(\'.xLmnpW4`y}$Il]&AKƽϠKZFS~U(LFV~TFRqzEdȀkeًQ.쎳SKMiWῄ.3VF\֕U-Q0+PuM4g&G(TCHbMrZBM~ѕ4G~'r֑>L=y:$d6,d' n!(J;HmQ~/?eݓK;I&a`ZW{dn#){(Z3fKw-![dsauM2P jzgC\uy^ )aiBǞj f,`U!2[J]Tu4}9<´t( {TͳvRHAK<?˻0ڿh B:xGq$TsM:SC^$fD)쥮Rb#W a4%73F Lk?q4 'LfgpoH)bBŴd@i&;Nqa-Y&0sݕb`jX{*EVhue5 :'*k :%B( fj4_dY*k2bj:% t:Эt!H-UeF,gJ2%iQ yUJQi 5R*,IQz6.#,p{p&$4 )Hl_w: pr G6<}uYeeݡ,wMQ"CI+snMaiyiR;y0=ւ>y'p)U$w٧O4,JUϚ&T{x=a|93~)>m$ 8bB )5%EM9R{ɷzj~V4sq6+bE&)j`I~b :H(iHV=^i=ZѴ{Myۚ_sw8mDw!8}̠h %r$DI*c]W3]1^2v@uy,{y`ݓ4t{at :ѸkaO4@rҾ1eTU?JNk19fr&ei|>o[:VҒ+k ?v׻}_mg5̮7O߻I[]iU4F_5]6`v:aɌl!E!D?_~r(pC %sVM!w8L^O"qKwГ>mϝ3QZ7]&Rb6ޯ͔)uX <,NNXȏ>n.Bjۉ ;h'+tS"1+g&*mjiE]gVCT{U8.j0<5'ؤo#dfz[/ڞGnM>U :qZ5(/2Kv甝s%({+tE'9UR0Cƶm&d>ϽL!Q_Tu!52^fdX)b7)25vD;a5GwQ/7:dʻp~ bQ\.LZ-ceAN7zӍFF ɴP *0.bgP*6zO.k!kJu@hVJdeQ`Xn7:-6r W~tiquy7M i Oy6vIXϬ=.݇at0H8ŦƦ5Ufy2@zqA=lކ ]ߔQwDr=S$QbC =y^ rYTPԪZ^hP̵TڷkT6Bcyrd|2huBuq:wJ'$ LQ T/n$N/r%u5=]Rw=]\!@*g_˚ic%o>Jƃ4> +c_{IBH.jjtH] @w;."k[x<,yl* ёZVT2ɭ! D&RYfx 9ײ<\ky;Jz7^8 &D9l50 d`;HXLӺΈLI7#mѐC XIͭv;w%ۏLgǽ{pkևAknwWw؅}O  XTs=S/O} g-hk˥ eL` 5v!{4NOQd5ɜ<zd;>!, J6 m%aΒxn$J΄Mrv(*R*/gJ 咣FSZ ]1DAEY^0eK3k $ժbՑTJ}9$8J&7xIn$eA) a0#ڑ}C8kn щCJsX/O H;_Oq#\;hF#xC!+k:RhF8JZ5**/&pF ]HxyxIP2Xz[DޅІهk[nKEoYDk㡲C8d+s:zFZVK +l i}Ƒ~wmXƢv=$N7Xom$im!jiF{8#cJFVn4Ç举ůKbxD[ qvr |Ht673h3>&#+\~b;^Axrsv~ekZz|0؈nnī;m\m~ F8EB|½qjhEQk_dgi f|ZVtiz#~_D ۟-3 H Uvs@n>޴DV#w9'6)6[[~ǾGőx6D~{,v]DzՃl֕/٬k%aa6B˗l?w/Ⰺʶmk E`Taw-N6Q,3͔D1(cZ>dg~s:B?ͬ@NmRjb_o,}KZ~XVZor+K2xQ%+^HClf9seVQ˅SXE𧻓ٗRX 8;v3my_LPxnR:gamgr2@s]ĺܦZڎށ&܇qG@_2؝n6r[◿Rjl9c +fL!;}~hjRY~vB[#>?;}ݬu;E$ V[0T^ט#r2%%)R" ;5 J nɵK`-3vg"jo±%tQ;" c7Αu⟏*{TWHưV謐~2ذWL~ýKqF>[@rnt~pHɆ$}~n|޻(*b ` Q\ t2 ǐ&?oF@C7} A΢Van/îIo%fIZqgˢ5~ߟ@ĚY=s-?&ʲ0ƅ,IZ> dÝTaw699,egp1{䃝1HsK{$ 3??=yyy:s %Hѭ:ps Xߧ GgK%`ƽ"g0J?߂">!@U٤ *_~R{v?u5ihcr뻑uyT]wrׅJp$U:Bu+K9!" * o+QPtzWd{^IX92"2E/ߝPF!_>=}k'~{`۠i~9|8߫%i.ʅUx7T:< z{Ǡ7!(g}_/.\|hRe4O'E{2~B5(Sxzjps~<:7gE[\P&8UއJXnɳM~../ujPhbgvfU+q.A2x܆jJһypMz"B20M2joi!S;t:Eey<HR}xV%.\>{f]>`$_-`5Q; D7uroӶ?hj4,dSޙ_+~*ܙ~P'0Er@Tt! }sH7lh2MtPYOϟͫ'PRZ77;RؙlxWqepi|顺ex]y*z0+>dƩJ 5fBȬe~{>\5 =\t(cC#̜WFFevjZ):dcX^A / EizWG+$o3@^QDW\<1? 'z..&TQpW$Ӕ^EV6ޑ^F}Z,sɉF` H\= vcp=±Ę*;HұS.ph'<1qjMB1oQ! 
TT'*2dowLwLL_*e }%c& 3 :K zA5S- VP6.֢峇{/ɜͼ ϊdZzfNlҪeh#yn&8B:g,h0F G}`(@Ld"p,u{$LTbn9L9c0kJ7_JC}ZiZIVQ1v6ʦCΎ'ٹfg?wVL\pl9\p+$[|(;*ókVk`!*eKC^%o f|תZ 䫷n3 1׵b6րZKUABMX )o?} 8L'a\x{&djhRYnEq*kUf rQ fpCvyl8``v@J$?*^oB0Ă͢uh,Edh:5oiB9] CSм,eм:5 fkBձ4m>/&褍؇33J\᪳-?-lє$Oo9ĬMBo~=޻/x6ŝ7AZap42WZբ˛g{@dVi~oxv%-_'/f43Z!O![a)artkp/LLߊnC~ ;^P^I1Uc^,)KNY vRSV vSӚ5ijKo"7g_%qY_\ϑ9 Go-wZq>Qj?tmbhuSn4X)c11&Cȳt2qc[DXWd$/$w2%ɬ#|lQ]R X@6Q<( j&0&)k-\E1614 -6ahs@A3+\H ĜZ98mPJ]Fcs %* H=WDaH3)ڌdM/} <0XO0!@'@:a D VGyX~۸Qg#`-Y4*([{*6*GG+lh$9l09U1_S! E@J̴ C@Te=Z7HreE*~p8r}9@૭}Oq$#Ӓ&X TbŘ3 s4g|\+N4AIJxcW]3n" ׄY5jq'f>hމkRd(u[] o;>S"V-NHoNjw{jNS/\ܙȳb)A{&L 5% U!DRP! uX/sUBk|ZUYDhAw?7A܇?ZuVPހ @!xo7)9$m.,qrP;$YG9*l֩EOy n߼}Ƿw}ssoa9cւ^㛩o8999awefYfڨeSUO]FW!QR2e IZI )TJɩCLC!@J%hlgh2nPtޱWB D0[#6oPM戡7GʋniGv8'@JoAݬ؜M[ {VJl|U5c:F=kt{ɨsk7YnWD>]~~)}`͢-ϞB_J^g/6MeVbw?4191`v} ŭ]zN߷+/;ߜ_`'M3O'÷ LydƷW,͘2z ~iJ EgX(zӦ٪Hk+W:Կdˬ9sw3.4L-6s?x0G;z#$V,7w7"KC'[ے9TdP׵X3q9jr{ 6)޸"g]4}:ȨQ<0/Hg[.~qa> >1XxAEsz#ھRLr.Zhfwoo]Tc'_~wivnNço7GNy2|gp)}' NJ)myڊΙ7kym_.ޗ_td=o'O8yP#̺SN^v>Nљ*Tgaj^m_[J=}[bLx[`y45NjGæTJzy֨Ւ_ .T2~/Vn/>'?\b!ge}0JJ«)O^4QP/\ҍzx'ÿYy;T뛛<)|"!5{]Z=exO.'z\9V#ݗ!& tO.>ȩrnoO.7UYOʒ>]=貱/_Ç_q>(HxV1V^[t3> X&u ]iW4%N;^ %{S4i/3 :VȮpӶE]>~|}3f/.gj?Mc^IVpw<* W༦Zŀ*(גcˋ;O9qteݲsn9km޲~ګUX 6ڨPZ-^yg|V*$[n?=AO}ر"GtI41dh۫mu\ S0W{Tn zN9\9mz'hU%Ƴ8ɤ l㬑]0W(ʀ.zٵʣhB-6z=L=V\U`ksHSpٛ"% I"Yd /K `R8h7IOW?ܴ _/CWpLZhg*ܚG:?iD *ӃkԔ jNeq @tBH+g[襅RZX^9:o=-zc}+FvJB|ot["6 =ԤM,k3v,rz]k[dٻQ"vؤF׷wtPHb,Bk>iګ؃D`8ЀIk8\{VǬ-,`A"CMDe *p>c̘k*1[fS4,D)Ɍ1mR@A$DNaZ[U C1^oKPbj=Y1(F(i;kazG11WB'̬+*gS Ҹ AnmRN|$bU.Β2Ӎ0:n6؋ːYBl!3Z=XGO̓7*'?k)k+Rt5ese:\2I$aQV&F=9Jy8|5J(59qmݢl$W]o-yb :<ļ6'?Z9cQ`Rf)ZqU%cJЎ E(ZT뫖أi"_,с Vn~>b?W©@:WiB'EXiVlۇ)CT{eӚqίWӶD7xҊ=omҪ? ?}K kԍnKnLXIh޼SVr q )b jiߔw݁;:( 3QQNlFB1keTs֦|AjRƫF'(: Ew6J?:dWk٩@-krvY¨wӫv@A|wS:4M6bd0fĹJ|#wn픢Q F!x }z q40!>M7w莍E.%6f:Ol9Y<6tϘjzBFt2+R ;(N\>kNP.I_׸i"n"VmzjƸm'y 4QdT+Omc{W6 /JGC흘vx2^lvwfX #2**K%-&;^|,T)[T)T!K|l_!PݢR U4Ra*jڛpUb]'*>6!FwɝlBQhZjbbfLIH54؊4HcAI 1V709Tu"AM=u7[$1a9vl\;(Vݑ]A=b>!T%mr+mݜ64S\$ZOɬmĭU);*5xy:A^Iv CY:nHεwHK"S3\ fZ%"uṾ;0^+ V;c4g{Vh#sf%o1TZ@)#z}'%.lW R\Lpk[Bt0D<&,OTՇkd$L0^O +7J=2mN/gƱ)2FR 6~h GQ<:kMvZ~j m @YHI8ݺ)$Rʡ19-5eT;1c:Fx'Qme۹ %}d~,3DfmFD!5[ \IeD>p鶓F`zs=T󝭧Zoߦ4b-> O2ےIGuJUAC .Ex=߬uigE+`|uK˛ptbۈ4Uo~!,F|> qXk$=w~Z7!_j;6:Uaר,\3UTJ]5 }jKQGe/|$c@t&ϩ+ȑ%c7kD QcsnUaħћX41LJO]eο}7e|:*08=V?fx"$dh"f8jw l=JߡcJPQ! 
UW1b};\q"Ŏgok+S!`Z^_ 'd-a6)yHa{%<[\WF7/UwA Aw: gA02ZGܓT ܼ$g>CMJ%YhZ&sB%( lfipWo|k9uXp3].=F &IW bND8~EܕJ_^fwey_K}ч2?z p2%|,gC塿NGLPbFUyc2&Bw=0;WL)mMrp'K*&-\[.Iϥ+ (IOg"NPWDlBTFA6,\ED iBJUPpK [# 0cdD-N+h*DĘ~3~=WK2V`b`Fޔj]mN.Wa &r N1Zur/HJ!5Vac4KMBި7\~k!JF6*]C oMJ:$D*nY)Vd{p\4]/[V9=ZQP[EDQc|/_>ϷmW}_G0t'mhBlKvC\ks7B)2D*,"f{Ch dX+E)ϭ-`Ҋ*4dr;0k|vNJSgMm535H|:ޣ#i觑h+1^s``uI/nn.+ 3i龬7SeB8P<-~OHk uAk(v<4}Zt&hĝCԗ$G7Aކ4t$vNW1h>=+c='ץcJR)GOgѥ ݵs*0Zk̴7omPI=n@̥e 1jH=T?ybLM<\B0}IWXLh\DAέ͹8[o޵}?3B!׸;VO^5A߻jc4]K.xM9v1tjrxM *}& ^yF%th[6kO]Sl>jay~D^Η,o׉ ?G!4qcz0AsmLy.xpe;!bo7SLncoL|})ZIophäc`J`8QQI87W: myLs/;?$uH#b-*eMYPso~!,F|> gFʜRvh :h-7T)ٷ*\/(Ff@܍sg idTԌ+]$uUwo#!tӽ@]ziUWyg8pVhBe "'n2hxlJ)LEbHLn9wr HIItN9 +B&Z9ۛ擲dǟ)<db?DTU-;]U׸ 7^L[؆|y_lSӎg:Ԁp񊎍uQ>d)}Ҁ/ZPZPc"ݷ⟔N*zF03*WhqIJUsttB2l5R2& -'x$xᙗ0lNh2ekbs%r"'By+dYsCܻӱh QRO.N ~uZ#J9 8orgÐ] *c*|2{4]&^|g{9ta$sxpְ#sDr_hQPùdP$:yqu}gRu @ˎ@=Z ~؆#ƒ™i] 3^ťQ_c(5q ) uԔrFe- ̅^ ʱLik)~ǟMLez;o1#>cĘBUe9ٍچcE9t6 =4j(oX] tPn%R *f5m:fQ3 G5-MkjtRA/V|y:J9_T8kFBCK˥ Fq+ x]9w Z<dSs[Z#[  SJʛȎ\cH(9R&"^&T9yxǸ6%fcg!VQV{x$ w)>)۾#i,ى;W*1$.2TI+vdhD|ǩ(WB9.YRFahEբDg=2MS`y0 sdJQH) ,H!Ll`ǛS([v;i4`tSd?wf<̩wfX4B8msBE(o XظDg3qwJwqKL"BkPyƂݐkk,]_*{< 5'~8eMҲ(zP.q,nwa$$1bCG:t_4C-:H3C(Qm~ }kcSUT3QEnFK32e''!KKI4ƊxLP@%2l/}a gh!&I)!w49{>o#0ێS0YInlQH*<"Жϙ8:V~)w?͜'!8>a]HQxu^PLä1e|;=}(ǭ,cͨoMUT3Ɯ7X1EP I[: .i[/VA9bG(iBj{TNyUۍ%RUhӏ_HC'R,J i r II ƘiN6,1VRhYUI O( |B]3* /D[r-Fr?x_RJQ(5"5M˽] )E t+C 8TV:Ӌs}EW!&DhZP۱U*+B͉& W`)W}YF1>z'8\ZLvd\!ކ}vM$0?1qvx]Ū<)F9B%2NumRҶ-1ErhkbO\o5x~޷gag2_)u嚃RWh%i uv$ݣ-CB`!<~?4Z DtP s'V9FciP$!rf҂|d =R)FFNG6Ίd)ћ~{ teh8PN%(߇)lU?~\C;ӎ_ Z !WDӶIQHNx T binɅ@81uh5ysXV|Vܸ[-\)/&^+TCU&݄įԿXw#u Z7jm2V?GO3t)PZp8czirWF΅XֹGCM2%A Ղ-""EVGBA4)HY  (qʯQq8XM8z>B;Wb)H4^sj֮ |d2(g& 3IRrKy3\.,yKG['D<+j<県#K>uԉg‡9{0K=G{پ7;ӯZQJ!9Q{Hێjc(S˳{!JOX>qQ"*!G/wM+ԑ ɶ.MҢʟ80.DېAKkx)3xϛHds`˹L#Jg,aԲ0=Պ~FZ}s[wix;?dT:nq$\ڦBAHc9Iҝ+!ax!^ᓺO`18d=ɗG2l{b,K.yM4TfclJ]DQw7טӻpxY8P0z ˉ] [J(ߌ@,tz^ SD򶳽\N:()EC.RwN)0ſIN朼u(n$(%)H߳BIye%jnkh7pef\`^0 NM:"@B™X_|u5EEQ-v "p{l2,BK5>nMV{<>5HÅ/JO[OwZ 1wV|#"#s>gx.%Tk Tc4N15s]`^ 0ohDjm qB-uHxǔ`. SgŭYJh[R֤Vߥ<91D8 ǁcZA\DZۄe@r VqE٩Up8ii@*.>.Gyi ƕ@;K BpDȡtdW̖QPœAk% {;-:B(TX\\T-P LOo@r:vxeLl6ODO7KxS.z Yڳf)1AzpCƒA"0 F)Q TXưbj9И)Iejs\{Vqy;=d5j]wb٧q!BםڭKs^E^ç=ڽ3G"OCc!}F_Ln\sB e ?>+J/eџ$f?ręJ _#L\r`G,^ * 2I4X)\sqg).{KKXlZGYv *ԺylUVA;@:Sֈo2L?Ƞ(> Ml ,+4rl⡵qEw?eSwG]eY?=zOFT}mt""A8YpL_ם`"TYiXxQﮦdlg%fʵޟoJޠynM*Đ&Pœ1N9F s$Hm[pPR"NpźXuV 膩F "Ejms $AJsq݁jqqbs~ =>FFNy9"$a=T>+2[tGfj4 0Jpz"=@kn2y_z ]itLԫ=kuv^ԃzS \Տ%EC}TN?⦻z,9\Y4 l.?3W F!VBsœ'z(TWC$-9nQP b]rΈ[G0SbIz;Kƥڈj]=_EbZ:C=N-l +.u2ȅ@RT :rQ/`ee?,[z,@^瀆z<,@Nzj±D%%8F)&%IS fbHU$0R4z|O^~@n{ ?6Xg7X ԆpMfC @A y@zeYI,9B@`?A`PJø:dE }kܤȬ7R΢\H,[dZeR181 26hS چ.>tYeRyAsoc% (_$$3lSR?{OFzJ0ه큍vLnTAFIxYUEʫFödqe.)N rl0r)ˍZ  bg-"c@v ut̤d(إ'ȻytaOō7HyH E!v=X#|5)rnEe6 +,ywc*(y 6/VX0/SzS_jߙdm" @:z R|{;w'0H5w>qڍW9-x+䏯ruEms\iVU? +o ǂ@ …Ƴa"Bc:%dfVKAe==ؼ [/b`ѧI'?Uʨ$TxkCj ?Q28` aJZENBU͎2槑 +^XdkJI.v|ehe [(n>v&D4ߖY QmNU=ʯVK/kpGU^mk)H4Nň!4xP5r@ƖJgXפJx{{{`aB^B(LnD]#Q(IdG/VuHaZ"ey]DEPx:Wqm2[痈"Dg)OF29i3(8Ev`Ct+,f(vV"B(?VLBYў@gVUde>3kTN+Jeh3`rUe9N qNi Z( Th &˳}(vQsHs+O;2U7T7#\J`78{- "ӑD%R[.wޖQ^i5Fls)8vDA(|y[@9 T93XDrjo+D࿗,I:gJ׀Ü*Rf"KCr0 X%Ģ5-,ʱB p. 
fu}K/'8tr`7ᷕI-E`#HpZX 8T< ÌDuTNw k7O)0'܏~q4!A.;Q;YkD5k4fjiبwkZ7?ݏzqM0퍶_X( 8kSvT m|ᩀ7:޶hڎHHsTi;jx*5 w9(?9S9(*u9$1۾;W 㬊G >vgm5B)+u:Ga+PquVV(|(QIJ¨j{k >gJ^mG9 I(+Pf3Q(]  ]*s}J ua0ѽ_8FeZ枃h.GI0]] ڿ1lS366|%=`M-6;<I&c6@3t Kk˷^#Hŀ 5bhl4dyoBT_Owi;77sc"'Z~ɬ8q ;K2~Be.2Tד5P Et_L[Z^=˰&͞ɏ=#!Nd(lb:GaT5 c5 ì􎆺Mtrtq6 Q1a!+ų058m)g Vc:ѽ38FU'8xaʭO 2Abi@BțyCrA>YuO$<3$|Aw4n O*O!NR8JG>N)\,w |Y^߀;r^k;UJ瞧5{$h^*;L.UCpUkGMi$\͑ݻc2w=q$+_ĻeCbOn}\qc44 b;p᝻[ aGd1>MO]T(DE\陭ܙy iȩ7CI]6O1%p"Nt=(6PT\7b7D).P7*>^7J#~Nq=(ML|Fi@@{%ZQq CLe'=tVWXT Q':0Bh2ZWh(%Tlt `{jaRH>۹@$S?V1v*d;+S; W'\QuIm_^p–#?v*~ 0M?/?:bgӻ4-./.V\>=KM7*5P/u?]&lXR!U}û~E`k{qK%a|o?l俅Zb0ljb8d5/wUP]DVg̲<\뢠@l뼢},Pu{F洈rg*U5 -xqǫ @,0vmޔ}8|hG',%*αnQ}6~ e##uE+bub,7@I.W(nW,s %ߠAIWapxT!-0B-r8rcCPz帠!r=/?/e5;e ]03RI<Е;'Yc\䏯2ْn\^]ЧR*^Tt[~_glY-?ªD5I im&qfBpL0TSd lQTlSsf,nitA*|SRVv_\wÝu#9 ic^*>1rr8B UЅR5/ V)`~ SvM:05($?ڎg|36)7%X4dc5E2 '@`] =d&婧K8A}7)ΩiFo<"$c$.(sg\hYj-vt9.8ZΤ0\zj@sģ$ O>]D,YȊ=l$]$ǛY@@ݨ}?TTJ(c y'ގ Jw 7닫/fv5Wɣ*\־h~~7Y0UˇL+>φ9BLrly5M"IAJVS<'+xm * ƹ laT|GG.xˡ9CB4E3`ײyf'ȩ\ CXA ]`\c!qS9m֑֗I-hwcMM`MY ,tH:t{'v&Rlb0>9>lxAA3\ydqs5`Š\ ֫ܰh6PH]S!o^CqcW=~0:lf.7~n]}""9\rJMDX \}[1輖Ѓpf#p =.8v p#8&N!ۃ^nB-a# j! jZ'Y>D]<^fOwh~떀 b4J|lJ^)9Ey!ܻp2 i:H=w\9P aLQ*o] Z.e Ms< E ΦTs&AqHf9P` p| -~9Xu|$Yr+ſH`Z]˶{ bŐ3OhG݃sP*R{OJX-Ƞ9&[В1cTgk)5XN^w\(no?JF[ٛl "Z njQby Y. ?Bَv>FF%ſIqpI {m!s!z! dfd;},4_5-jOko&j#}|M|3#0Wclsv3"Ʒ~OngWy ?;X&saf9,*2Bqܪ3A3̽F-AJ`Iku/*Қ,H2i<&,2VbpNN(u%BϮ ~C2jj:6qȥ5ر1n ̌n qSpTݸ);TR =T(hI_cH^IJC˕,:p* Еse<͆`C!6T) "4XYRRmWZ[rH&$[ZɄt$}EBJh! z5;'8ٖ&-L\HsK ЂvMIVgδp +by+ x cLkz*r嬴K7HBp&-OO!od9cC0*/y)Z sX z>dE⟷߿{?HϿ=T{\Ճd]K3I]x໧_Aʿ!kQK>L?D[Ԉ&C ^i]&TڕqxI?FFni2J9C/zT&Zyx^z*TYPy9,7dZLnx~M{d [i&~B~v7d2}[8 Mf+'5_ݺo(<(S+LzO|:5BjJM/ [MFzo%V l./- lf }<=rTOh^ӯՙ_#<ܔ_c,ՀbknS0rџWڕ|o'$䜾* PtF(ܢNȊsP$K$>0YT&F7ABOtj֢`Ys>,p>Zx[H[ Sn?aMn9 ]tr=0JQI>R!_wt9zb1W #:b]?R]՝zCdlǫ쎩Kr 9kXV {P(ڴfn"svfmum\d6ѨkT2^,̡q0ɶHz,`,h : opBA  \،sx2x6f)C9K1 *ɵHҔ>/RO%lX,-0SYN/*1Ғ)N2򰥠Im,Ce3ΆE,Ь:bEdT%29*2<_|HI#ͼ~Bw6O2<]zPp=deJ^"r` ֞\d/\ƾ/VٽĐ:.F(;s&_(v"yFr#kcŤ%AJg:"㫪ـ{yb"oGiIp]Mzn֮^uS&[q,Ǫ,qb~K8ͥqJ\>kzm]:1sXa&UjP]i}w=uKuzx[_O弜qu/zh{ٽ%(=w$MV]w^'a(/9Ѯ(}k4hLFKk!j]#Ǖzpl6cmيqҪ:O];[hmg%S#\[Mj3ϓ|g64ő;eQF@fsw7I2@`cԟs( 5(Xo8ӯ $[pƌQr=9XW/+IKH&(z3b~׎ -N۾`W>fxG7*cH  a3H^vmKQv6WOdMnkܱ`G\>ޜ[d*a4Ch (TD`N1DL dvh}PqCejWj1s y{{族#= CP'(~ ଔv: I8N Xdq?|w,K? ilH*'Q`q{Of2ffy! 
;&xoroկV7VkYMOlnNu2jn݉lŵH(D0\@g,!*|es9޳8JSۘy=9;eCegvF^uӂaT 5XǷ`ㆅ>_qc^&6 sS|f/>K*2+E[U i6F΅q JjӇ>#wC阾( KggηӃQo@ ?[=!':o,\ͨBJXuw/kOB0 )\R3b4Lw [CBX+7Z١ b&D 'P8^ n,~sCcmo'pu<Æ>h4.~9{7 g@H ~W ܹ!Id&L1bd'1P B$7Im*_hйF{ Y\`FMqR}' 'msBve9,6ѻYƚiEDF쥧zf&klSNot-1R?\۔-ɝʗ%|tz5B*m ):XA$I:F=i'Dm`A9cDiiӌd&W1p_>oL-7 {"Ih;T$vǂ 'L\FeZVXSFV ,S:MufNɒ!Xҹ[5 2H[|; ?9FPXt(eV"cR{ǽD)) ΎBd\i4B"8c &.ɟ gzo0 /}7yZ哕xv&O#mԶT9η*u:"iRەzsEJ )7)uR5:>ͦ5Ҩ:'b,YRW/ٻqP$qE76`Mvu8n\}M(]+9t2H>h[5vʰLfQXBsI `&2J:+4Y&!FˀwXwӠ4d4Xo)i۸T[$y6t߼mzunvKW+TӻG-%g֒]VT!Զg͛6[DD2H-:m kHSF Myp<,oڻlj!wP֖hwPvjd5?v7ċvڲMIϞ&ԨkCֈ paMhG`=fلrAKɘF#GX$g W4 s9E؁)c QF1ĴWp"d`"E+^hai ӑG"/bB*2C:Sޣ qnc@6P) D X(iVWCDa`"r+&;@[Lm- 2;@0Vd=f}@hǚAA1J)@lHeK)h>hi9x -CqruFi$/ߚP;FxomT\8䌎at;yu{}4y||,}A1X|O>$~oU0Mu%=tqˏgh@B0v#tZtD%Bg!OԠ`J+d4l|Qpbi!SR~5ZoթN6Qk/^ay%bDqӄ^괼K \նF$uC5ѹ<ԩGJf:2hjTuaէe,:@ zѫ<`Gk'fc$R:fs0&=X3 $|544.F$6{RHUMփ_p9{y +*YRbpf^,JT)K*ta*aަ/i˼B2W"hoo\xO kQu ]u ]Wt桐 NWZǨ%)lÐ Hj0q$XJh0/PjF/i6J^ ގF`oě{cEWSی0 QP͙o&;K|/YAJ$߸2HI qKAN6Z2fW Rʓ3[wn6/50ZI ."ϊ\}m 3ˊ.N Щ_ ">1`5|zJ/P$!p:UY,TrDy:D*"CdFX(5 9ٶHj߸$+hG7wSGwl%1pv]L(B+J% Rl F$xzD9F ]loݠIpԲ 3EM@Ry~tth1 Q=&_Bٹ9LF5vyg'Xs wߓ==G.@9OғӔNm);_͂X,죓)o9D_ШON>zqnL`y4Noɘ!ydEv &6 NJa/#3 Z+,yKOִ""7}.ۭΓma0׵7v W﷥"l.sbayO'+@y;h/kI1؍n'R$.f\[mF  fx;JffJ=pߠ޾7k}t<يg^'<1/ J xQ:'i$GifQ̟9 FRҷGR7Xo8;s&'pWy1*=r}g99QDmi|FW׼㘵A>4G8ۊht0ӇiS)bR t;iZi_/{6cU{ DBOrKv^ypQ6꬇m+lڱax;sK ۟euP>)FOҵ,ytܫdx7#kz26qn9ʙRxݍ/z2QI%V{7 = e #`FpzD֐7XDջ L\1&NHa *6YD!;TV]Ґ-0׸8glKl5AJOI+Dzey-fŋϹ>A97mK z)o/)gWz.B#}: ?֯KDϕȶ8*Z5D8׋r?߽~v(;ToP.ҊJ=p]m]n,/3e,QF(4xK9'!O&S]Nfu2f=eK@Ȣ("WI4S"Z"UO$v"Md[=/YrJADž} 䌰@Jve8Yj[V3ө+9# wSQyu1z)'5ITK^u ֊KN5 Tq`ŔDb zDA &wecNԏA XΗxI((``r`( |v@od>d {AR Ylr7(hu@HZFNW-#y^`%g\ DM`v`a/%j}t0 ":G`{&hQϙeݠ#x#q 30<v&=wgsOnsb;槸9cTv_rY% \cb2d];Qc]i5;7JʂzScDZ>EOD DR M{vܛ͒{^[GĦ)zsQ1~3;S{ޙYOF'8ʔꖱ554{QJ>.nWUTj]cL4]nߏثo+8ExQd gNCLx&&|Rd\ lm(B2jf]+R[9NRKg;N`Z 6L;ӧ[5XQ0JQ+'ȕ'qMFJG[ub0~r#XhQ C`<옷ז-2J:"V-DPNۈL2_`J x`rz"!N)L$aE2l*(BH`-!U'T@.!uWX(N=B'DW wMUO7F|'>wK=ȟݹR-bџ^:[$Xy_h^!g{h>o qil )X?`!DTNڔȪҺ2 PZPml S5@jBN.c 4nh0 c)$Pɨ1:0j/_Z5UK_z5p\OFĬ2$J+N==OS}B}?!V>L$Z1^rՏ7wtQ \={N EjB@qkƓtκ B\"iWǖ9"Ѥ~Eaq𤓯 q+}M6(mѦr _.+ ǒ(km#9 /@Fwuv]%@觭DegI!gMص%rWrHNr\v$3Ԁ.jc܂BFD 4@(؁ ;AQ5<+lZNS8&VkkijS%grN Ո: !Fp6TQq')*EkLH !T'J +2xHuơ}, w ͂c`|J8b# =!nݷ^|xo.fE췓 .sч::?Z^rPW/_Vc;g`= 8j@~|ۂcybLR± y#GwHum4^XV\df3m7@ee\2/%mZ 51i \U(@cƚvyfF#&Jlfmۚl𷴙j4[S, uFˍ-EѨOݚet 5͓VM2ν']{MA8QG ]mݠ3# R':t~j%@Ao"KKMU>*Rȥv!$AbIgY|vDQ\1Faȶr=±VWN5@XG꫒H톖(>H]zC#~D*vnZ8{(0ʘ ᑂR:8l$}|otFx,0sY{vv=ߢr%CI*5ho6w!y I1Lk5}js Bg$X(iS# 2QzT:3$t[,KD-,E"$I8xyIUgg1B&Ls7$xB@E!idT^S*F&}\;%*) [u14 'RPZ wQ:^T2#lΤ)ӑ2T5Um96#7⃓ۛpq۵N?~}yy^.~ E/k//|y~& 'R}NH.:A[5-yϻ!du `&gTQQff۟(]n-:p\J*l>cT)`ȧE;#ÁlԫShi%sٕtkC5NQ12BC8wr!@ӹ(RrDh$V$6 DSbQ w7ǞtnղBPðУ(Ü ǧm 3t=0mFYAunbgU"}V%j}|%{_z Vr3*!6w+w =^C%EXoXp!}MXޤ9\}IsFFOM]8jr9^ g&x8Lf71;lM㱫&N|ԩo<.ׁIqvݏ%ڇ%vT=zUh[Uׄ)sǸbP8h3}4evMJ {k1}UϘ x<Jxg 1 (-9'皸\Y* HWԺA=u-C٩>`g9{rVbiBB֚Dʑ8)'n,5>DX⼥Q1=J#)ɨ*ojд h0}3=5SeF2es2%OT)SXHgBQK*Gv ZFϤnm`Oon~q+Ubewԁh>s!(jOe' WAּx)~zɛ>xwwzҽ lՅ~]U<༒#z:MzZnフP_zZFE}[-(Hmnp)p>ԾMCZEYLDgoW9߄?N2[ЇAE*#tⰁ+ɛ{Cnsi< L-.'G_|HC8JMbHEB zǾ>FF%w&-,ڲ?ۋs^_W~;gVH>By>;I#Q7L;Ո5pVM"kmƑ9ۓ=2g^LE|Y}QĴ.[]_>$xS/kA5"scb֛k<0غUZ~Ggjs ZUĽZ1.wm-Z!oBT8 Ր0fkkTJײ[;hNV?R>\ɼ "Fޚ VCzXeSűeptMfJN0Ȣhkn[مۅ⯷JS8DZpH)كmMjV8Ʀ*)xyI=sd ̧S  NySˬ -^ğ!QR[_bwRQ6C6o8gYBH8pRXS&jG̫S.w;7O. 
2Q|jSKf@.x`к3sJX˜~OB3c(xy9vvp)I&/_ʀ<OZ j 1R3x~nB\8Ń)뫂*ޣ{Հ_V,Tzj=1*~_0oLv Rp1A[!MhQȶ h۶R6GE]s̝4 v"ZwJtwđu'5S#tS'..h""saoJh=9j&8 S%7%5F(J cC-֕.(@P챛D}5@\3 ٳ6!T1+ k<(q8dٵ=| (ŠxyئXMB'ﱆO|ʂ;[۷?qeE\'e`dq5O iݣ6l} u{0Ai+ۙ?vMZ#|7EcK2(lohx:S>|o}|U(clP[b:ՓȦ:[p R:$5=Կ 41C}&).;AxK`tAҦ.)X٘# E#x gg)r)9WAn)zxكS&#I;oڼD R\%80b xg>]\x8/6K_h6HhDRc)KΈf>i1dL;7(`XF8ՔZ@F 0RPPU]|wYёÕG@}VQcМ=ԕ\W~umuӜpJ}1IUM;{* B :R(&t.Tn,\SMSƳb.P?wW-m.>0޾.tWD&Cg0) LG!: FoP"15*-y[@+"B({|׿We`EV=^ y6 y7钋.AqpA/\J:m`4B&q4|J emJ3R ; BgP|+b0J,@ suqiZ 9I]W1O,HE&x";F7,7$7HhmNy0vHC $ ߲))L A yc}@I!B x\E?Z׭F4$=ø+=Er-FOi͢Iዐpю$y7;`I VVfQF95a#wUD+`̗}r܏fx@ w P%k5F %#Ö0ot@u2#$1Mb$@Wk3\f7[R%PljE@%JS~bjDDIqU++35P|Mݱ@wm=nrEĈl3rtןj݆#R9:p﫮ff/2Wdky4JZefW&=|X\=L`q-,8]ާ. UEgOJ2 h`$vE*DkHiJn$lm<u7d%G_t{s՛~2p‡3x {Ӽs`/a4Nn`{όߛg~{Ly#`l kAkf"#6u8WwP*x)[ 8$ KQFT41Tjo, {ӫ_tl/g0P˸qV[؛$MT)HLtlx9J3ג"QT+4; F1ؾҰryWv *'rrq<ӹќ0Y+ = G,`X<2f9?Of;\͞^5OoX܆a"[s6Y~:LIqHp3Xp),Œÿ2Ja<*o0׹^0'z;-a`2|0IR4]L`8t ӿdҷ9/}kKKD`f֜bQD\ccTH@j0k-R*\A%u'Cry )0}a͜(ٮ=Ejjs)al'+A4Ư0c*pF52IHJkS$a#f:̘ϴ "z۝k dYeϸ$Q8eM4"p2ARBER(h^ u3zz>DhslSL?aA eRXԐI(q(uD0K(|P2eVY(’L6#o2lMΦEԽ>g~LV𝚁mI+s2גLjvDe,gi`lpjBB!: ;gA^p&plEp%8sr!PpJ[Um$D1 q¤L)fS $!8BNխIyЙ@U$Q3`$7pz _0(]fRBtrB‹ ;`XeSװݤBh}x0:KǶHC  J)DE/5 (S74W-9j\DT)NV2%Lga#Yp2 elPPmY1۞ fU9g;+El(#0[qsnT5 M_=ILIz}2j"Y1'/=5vL̒Y2,!6z0ӛՆv{yU'˙l$HyH@`lVREm> Uro:V*1!)ST` :$wq #hr,D eL0PΥ\ aZ|bذyy!vnZ^{\*<዇}w0jA/~Iڌ'Irg"QLD%KH%078cӠ[}=6(ǥ; `DG}=)Vkv7z~D)z1=hg+xW (^vp~b&d9 G8xuo';L5X?SbO>%܀%{0C,C!3oKcNӷf4 YÊn 3\)C\Lfōy.[Jsΰc/ LGSO௔tmq.A#x|=Rt\ζh5S2)nqߥs5 WPTq^GgRZ?0Q -:t\-Q}sϛB닼 ݮUQ/eAm EPbɕ:["ԽF_KsSVHH|vj7+]85$fwuuѤzJq|ߤP&vW-%B)}҈n ]gQ#LO6u!%,:g#Fr/9,h$Kny%v88 ѝp׿„=8 K M^I$n1>Dh{2C-3$wN|!ݵ³WRo v_Cѽ {]3)WK Qܩ>_W2P8@H( p# v{QR)4 4(Ɩ/ύW q4Hӟ6l}ÿ%8PAq_5#<&Z 0<24ů-6(yXi}hzwP[ vdʾN ` -O}~\ޟO%襅`5j/N.VlרS]sxPjR/ѧrD{_R;x 3E˯%=&ӝ)\~= IU.]Q<%:h8{!KOj wSً]^SӀkFG].r;*ͷ핊'f*kÚi"EKKnTe{T*?0B:SMpCՕCZɾ4vq&8QW)z3xRwǾxյ^>!+yC/7ʵiYR;=_VK4z;!)/<.;c ()%~&J/߭J()-Z%q.tG]M$2*B<C(Y1M1_L_|ŇNރzʗD->.h͕c~_wj茗lTh[fP7zOQeC0n歽'2ֵV+r_[*Y[&>זI9q/Npǭ"ùZF?>3Y@>+ZEJA,x$Uh sKo(I Gӻ~%I?Tg1ly:5嘄'i޻ ΑX؉)D]_`wq |go3g|^.+7[_!Ha}۴5rVXg*ҲS@KVx@^㳶}!]q_]N. <0$[ .@;oxt >|yWckmH`v/' =d}X[J' TSLɒIL0cbWU]]U]]E>JsW)]cQt >hEۮrDC\"FN*/M T"-v>>n4 b Us2Lܛ̂'WwIhW.K$A}G]v0b`[ZɭlR'nnon& S)ѤsԃjQ lg[;n >B/&B*mf̬ ۀB")J_Np?q\MtѷNxd>.1rB?я_n< ]a q^k+NP"6Pxx̃aZ  R*9ssؘ{h]l܉&@& h&HT{ߔR'EaLXOè^`Kh{*L(ha3 RgE)A$11!r1rdK%pNA;7)PľlZ,R k^ lV!T SxY4AOhZi\zW1xL1=L"+2Lّ ƛ! n7{0-LO ~Ex\ȍ$pD (0Y1·И`e0($W Ǭ`!?~L 5I:%AT F+Uj'kbᅥB#u@O4!/,FanD(/kFv^U| |G(ݕۋXQf\ Eq݃o]әF0R"xdԇ?|Dr\G+g9,YW0Qӏ_N'3j0xv!tZ~ʌ'@n8#hHJ쯘v 1qr}6w>sW,FD%ܵTٞRXVXT9'hn2H\bgtí*+,QpPdM!p qkPMцјkID( r7W/6( W؀9W %K/%qŘ-/+$-hK7!؃k kBt kbܼ"ƾKHip該T[a95ȈlZ ^dD1BZ/%-JT^|x,/?j5,Xy+~2x_#㫁 s K4 IEX?Ot5|Lzt>t8Nl&w!I9f6 A?&7|cvՖ"N .uE'#:}k 9KIB*wN /iE<-ª) JcHvJ$ydPdtFqeӷ}3?;xGw}-?W=un"9^hi{Eh-:$zk6x̚&)ʘwkH|7Z4FDX4xǚpM/Ui(sf0-h05nNj=!GppE(:F4:Gtw;}΢2J#[K]Yh`p * ENXSb;NaN+E AAq& cT`2O 1T(V`M"xS[O|ܨP|s}Sv|wؿY\w{]wo)X8^`܀$FhiR!<$0xρ (+ Sx3$L`bNmv3,كJJ:ˬ(3OrNfcm1GZrE`k 4 >} m^6݅ʹb*_#-xn(0+$a č7LfVonG76|uE/f3/h\f[ɛ18wZh8QRq^8 KhbS0[,ոw8x)+@b7)Z2(Fw s1o1?ty}  3C$8t=Dd 9=9iLsEMJ=e5XK;+` `"H؂T/qFk9Z8N5 SUٺ 6syĉc3ի{iFA+SOamk ;evG|ɮW2+= :}:rq~N6ְ&LV*u`f{=3e$KbYyuؙ`/.X:NV!9gf +^J'ff `- ϊeJxXO^P%|p,`\[z8x 1D! 
v4}W[B H Nh0گG6}=bk=*X;)*ź,fbJs 㻂ђ5ԅ_ZwauұBJ.tQ3M\W[~>= D;SQiZr/'EE/դOo+,>BVH 7 }8abHkCfDK+mY[PpVF)uF= n $'*}vWakh҄j^r.ޞI66^IzoMzƞ 61jX<8\P{ֹ jKo [7V lZ.TS;.{Uc |Gyʵ/؇[Ӊ>Nr[*ICT{vUͣ/dŸ-=4ccURkуF \.k Ѩ?S>vL*HY:e?lO [Pܩ&ANj׃IǻWbO:"8}2C4ݺixÇ 4X0v S&0)f<{9gMEҝ3S wmJ9更~IbmѴ{8-]T[*9E;|uq?4LICrf@kjZrJDZ524uQ6z3G!:N)9[V4=Q'&LaRwNL8Ǖ"&Yip>>LH^M.nA':ĉqnѬ̢Iyӿ~UsFCqu9L݅vPS^^2j[*sjhMݟs 0IR[?(dC9^vz,ʀ;jg`MN(pL^ƄZRe|M80A= ;=<2W=k)y8geVbuM_DFu衉_y-'ہ9YWvRns$I큉HsBRQTU2P&R2qQ IɽIL¤QU HEyIF~|=d cNGR㣩RU FR]E;Q**8a6JY1?GC$7Y!\~l]v Ҍ̱.{ԾNZA4췆Yu $#HL0buH b6evv#bekQbN㤘F.>S=b"hp8Ù$kR=bJ Z9esG̎9vo"e%H49sߜ,2apQS9 W%}ºW5qęi_IN(k"4;߄:mkB4σ?bv]Ci] LXMT&[b9OpK8CT߮ZO6U6Is͗t|k52RQUĽ7 BMTW**kMNo/S J6B䒀݇ %s|n0|5$$gܖ%X:Uo)"d{x=PeTڲ(TFz(T(ULo*bk&u-BjbfV(.WCiN9i\# a,%`v`3YzNk!h yLʹ'dg~ 9;w].j{ I~hٵ`.\֋_EΆwjኑݮ1g>ףQ yAVgGEg n+Y0$7q2"8f 14(BWoP؂QN읇5@+ .N{Ё>p g3?,YDPDOa8‰ UR\[8N$p/]cs#W`J 3DEM}S 'x]*NDL4⩙@RIy,atOwJ5@ֶ:xL[8rȓbv\1"2MG0+N@z05)_uAv~ftJ_N{gMG;g=D䌭F5`H.wE$E3 5ؿRjWڔ֠3.#`|=:Z[!"?02ϝXcyT~R__r(q175+)cmv*m?+Op$r=5|NfuEsV~|B}7JU`@Pq8'FMs.-2&\^T"m0Ƒcv3ٗ;!閆~H{@XufYI[e!u(2w(vK @t Ӵ>9Ʃo]&O3ο'\hߵbnu?dㆃğ>_DqQ7^gu;7I3( ?|#$ӧʬ2TXeKK+jy2,Uhˏpl&N0ރKQ?$ozŠ_O}g!3vf'P2N%LRƊ`}_Y/k{c! 3)Zm- Lݺ*5`͟)as19* jM6;;Z#9oœ5cvfԘV@_'E>T`ծ@[#Oc(s ˓A ;Ö1ޡ2p"~ x(CL95 .uJgi>|p&e֔*,ڔ g2rާB }.غ1Wx{K,dtJNsjp3}ǓnIZۮJv>i7ۚBUpaCYEf0w1`*ZȶQqؕy|E̗t}鋷]ͱqY!aC!q#K23}ylQ[s='qU=`T1M94옹Tq h-2,8GNigKau\lŐBJp"CJIэ@$*ѨwKUͬԉQEuԊ"{7RR+8XT"ͯiĄml!7Fk̨iCB:PiNfR+'vET\?l "BCǼӅ7nw"d7˺ Rdr0A[ЫDBǒ A8>+l2i[ +\y-'ہqkV5gf?? ʶ~R2*Yi]V[x9F2Qh1r(FYnkS*xu:d̮c^ ӟN=<Vn=~f꟯{.]GYαn4t0є35 o;q])_~_lC]|Cc@-V6 >Uwm+*7&SқI]LT&Dg,JLubi|mDOY!5xy/*P+C nz{7gč!6_~H_g`uQZL E\}!,~z.̜gE=h: 1U3Fu-݃& ݃?X/H 1܃Ԕ Jc):n^/up{X`.$;&2|9@C ul$;YeO_Rk4D}}<6a^qtT.ńH2 sM`nqw!Ό4 Vd+eah|TE냃W\z.Pgqk"Q`~dAףTX41Z<͒$ _-g7<;^_>#B6w tuĻM5Vx4@EÕpx)S ucJ1bSagA1 6հy-E%G@ŬF\.k&'޸FLy~异lmLgx8} (q,PEa6L VaX EMŘ)$u :fٸ0}qJ4RQ3!A2.NRa4pf Y%I)M[d MjbD@ :zˣ*3JC~b$Ͻ{ ~_쀊>cP=e3V3k`RJQT $ZQ(pQ£pŵ՜XkTV) 2\&UYn2qbkt4_ViYK{$L16t+jE#8BQ[I& HMD~r$MQrx%(jnF;e JsU ހVy@w{Rе">QDۇi[mv6ݵrx?MjIٓgrw o0 E4 WOhG c4B^shapmXXO3ކeֽ 0D&ə6o|yYDMpP#TDk4l `ag,EOq[:jV˪ÏX)jw0I 0I rFCY> Fx߁Z&3!*J(RZ_F$x2ZGۇȇv#<2*| ѦPx={O"LZWҌcBƽHVv鰼.o'+-%-%-%-\^O`H8QNhdg^^Yd{eʶWZ+)j# V8qZKen6Q)Y ? ~m^g 15 V_ANsV±t̒B(0-bi'cAJMKjlUjE⒕zچ: J9aX IP)\dD>%p1<)"I>-ma& @1s#c1+!aK X R1.Yh@6@4GgvH2a?Ӕ!VC a]@Wx|-RRk0ZV'Y]pU˽w)MADd,W^3:+?~|wc6_,yC&{,ou_wn|R%\t'h+xAJۿb*-ܐ"z}훮i$49肝3ԧ\UDR8ՂXN2`^c@SŞqE: Tc(1X]^1&R"D5m"pT U ; @',U% p-jKI,? xJ]^3"E;<>P`5 uzEhIсLBlWLjɫFQ{GwKNct7xf:UE[ISڒȝz1[vYͷ쳹}a"&~7I\,~'~|}|$wmB !SCJe:%ch%,6MRI '>A !aQy,a۵ .5% S53;['S|VDP t `49a-SaQ8л3*qN3`,9Je5Ȧ9r+Lh !_@OGiAp4uL7m|@(͠1zc(sܱ|l!']DYq$u<p&jR<)ˋ8FYV:LH>{G yVwdjXk|A)$p7'*"n@{r1޻#9BF dR[ySɹe}u1XmH8ԣ80XNrMլ8%I!T8ړpDM 5̉`LꦜS8'V=>N# ŏ|pyC9͆5{ Zm)*|ێڽ%҂ni& 2PtJ3J҆R_x!6NeRެmCWڶ#)UY%}¬HR }|SZЪ0*u6uS/ɒsct@)&m6+/̦1?S=¯|(N˫_n^ rR23wU4eQ* r:OaW1reQ٣.Q;q,R)t2U3~.''뽭uVuKLlZ,,?D^n.jay 96g/3z,(44^p|ISaUɋ4m0ؼu!8[ĀmxOlL'';p&Qڀ BƆ}xC, ނSacu`43T6γk:|jV7iY@k9zkYtb Jn)=v.4-lzM^sGc?oH!% uwXiO_R5ҚH߅ EOMMMMʂ1F^[M=!r{f`m=\!B* jQR Kkƣ%E4o%>z!6b`?EUxٯ3GPzh<[?h֊ོ~ F3E7їV;No|1 {pѓ͞oj/$NƫJ&M쟷S\L~{]L~Ywݒ(b9sS#-'"hdk8.cK$08Zm5ʱ Z(Yw|9I99O/YSB`Q. 
var/home/core/zuul-output/logs/kubelet.log0000644000000000000000003244211515136466320017705 0ustar rootroot
Jan 28 18:35:30 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 28 18:35:30 crc restorecon[4705]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f
not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 18:35:30 crc 
restorecon[4705]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc 
restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 
crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 
18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:30 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 
18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: 
Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/* not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
(one identical restorecon message per extracted CA-trust file in this directory-hash directory: root-CA .pem files such as DigiCert_Global_Root_G2.pem, ISRG_Root_X1.pem, GlobalSign.pem, Entrust_Root_Certification_Authority.pem, QuoVadis_Root_CA_2.pem and the other bundled roots, their OpenSSL hash links such as 607986c7.0 and 244b5494.0, plus ca-bundle.crt and ca-certificates.crt)
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 
18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 18:35:31 crc restorecon[4705]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 28 18:35:32 crc kubenswrapper[4749]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 18:35:32 crc kubenswrapper[4749]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 28 18:35:32 crc kubenswrapper[4749]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 18:35:32 crc kubenswrapper[4749]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 18:35:32 crc kubenswrapper[4749]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 28 18:35:32 crc kubenswrapper[4749]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.655479 4749 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658164 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658183 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658187 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658191 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658196 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658200 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658204 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658208 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658212 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658216 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658223 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658226 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658230 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658234 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658237 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658242 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
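The deprecation warnings above say that flags such as --container-runtime-endpoint, --volume-plugin-dir, --register-with-taints and --system-reserved should instead be set in the file passed to --config (the flag dump further down shows this is /etc/kubernetes/kubelet.conf). A minimal sketch of the corresponding KubeletConfiguration fields follows; the runtime endpoint matches the deprecated flag's value from this log, while the plugin directory, taint and reservation values are placeholders rather than this cluster's actual settings.

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: "/var/run/crio/crio.sock"             # same value as the deprecated flag in this log
    volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # placeholder path, not taken from this log
    registerWithTaints:
      - key: node-role.kubernetes.io/master                         # placeholder taint
        effect: NoSchedule
    systemReserved:
      cpu: "500m"                                                   # placeholder reservations
      memory: "1Gi"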
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658247 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658250 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658254 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658258 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658261 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658265 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658269 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658272 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658276 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658281 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658285 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658289 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658293 4749 feature_gate.go:330] unrecognized feature gate: Example Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658299 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658305 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658310 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658315 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658320 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658340 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658345 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658348 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658353 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658357 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658371 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658375 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658379 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658383 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658386 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658390 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658394 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658398 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658401 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658406 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658411 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658417 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658421 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658425 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658429 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658433 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658437 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658440 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658444 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658447 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658451 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658454 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658457 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658461 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658464 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658467 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658471 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658474 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658479 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
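The long run of "unrecognized feature gate" warnings in this startup preamble appears to come from cluster-level (OpenShift) gate names such as NetworkSegmentation, MachineConfigNodes and GatewayAPI being handed to the kubelet, which logs a warning at feature_gate.go:330 for every name it has no registration for, while gates it does recognize (CloudDualStackNodeIPs, ValidatingAdmissionPolicy, DisableKubeletCloudCredentialProviders, KMSv1) are applied as the surrounding lines report. In a KubeletConfiguration file such gates sit under the featureGates map; the sketch below only illustrates that shape using values visible in this log, not the actual stanza from /etc/kubernetes/kubelet.conf.

    featureGates:
      CloudDualStackNodeIPs: true                    # GA gate the kubelet reports setting above
      ValidatingAdmissionPolicy: true                # GA gate the kubelet reports setting above
      DisableKubeletCloudCredentialProviders: true   # GA gate the kubelet reports setting above
      KMSv1: true                                    # deprecated gate the kubelet reports setting above
      NetworkSegmentation: true                      # assumed value; this name is logged as unrecognized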
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658484 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658488 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.658530 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660152 4749 flags.go:64] FLAG: --address="0.0.0.0" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660176 4749 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660194 4749 flags.go:64] FLAG: --anonymous-auth="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660202 4749 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660217 4749 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660223 4749 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660230 4749 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660237 4749 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660242 4749 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660247 4749 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660252 4749 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660257 4749 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660262 4749 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660267 4749 flags.go:64] FLAG: --cgroup-root="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660272 4749 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660276 4749 flags.go:64] FLAG: --client-ca-file="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660280 4749 flags.go:64] FLAG: --cloud-config="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660284 4749 flags.go:64] FLAG: --cloud-provider="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660288 4749 flags.go:64] FLAG: --cluster-dns="[]" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660297 4749 flags.go:64] FLAG: --cluster-domain="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660301 4749 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660305 4749 flags.go:64] FLAG: --config-dir="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660309 4749 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660313 4749 flags.go:64] FLAG: --container-log-max-files="5" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660319 4749 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660338 4749 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 28 18:35:32 crc 
kubenswrapper[4749]: I0128 18:35:32.660343 4749 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660347 4749 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660351 4749 flags.go:64] FLAG: --contention-profiling="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660356 4749 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660360 4749 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660365 4749 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660369 4749 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660375 4749 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660380 4749 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660385 4749 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660390 4749 flags.go:64] FLAG: --enable-load-reader="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660396 4749 flags.go:64] FLAG: --enable-server="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660401 4749 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660412 4749 flags.go:64] FLAG: --event-burst="100" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660425 4749 flags.go:64] FLAG: --event-qps="50" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660431 4749 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660436 4749 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660441 4749 flags.go:64] FLAG: --eviction-hard="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660448 4749 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660454 4749 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660459 4749 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660465 4749 flags.go:64] FLAG: --eviction-soft="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660471 4749 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660477 4749 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660482 4749 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660487 4749 flags.go:64] FLAG: --experimental-mounter-path="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660492 4749 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660497 4749 flags.go:64] FLAG: --fail-swap-on="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660502 4749 flags.go:64] FLAG: --feature-gates="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660509 4749 flags.go:64] FLAG: --file-check-frequency="20s" Jan 28 18:35:32 crc 
kubenswrapper[4749]: I0128 18:35:32.660514 4749 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660519 4749 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660524 4749 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660529 4749 flags.go:64] FLAG: --healthz-port="10248" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660534 4749 flags.go:64] FLAG: --help="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660539 4749 flags.go:64] FLAG: --hostname-override="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660543 4749 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660548 4749 flags.go:64] FLAG: --http-check-frequency="20s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660553 4749 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660557 4749 flags.go:64] FLAG: --image-credential-provider-config="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660561 4749 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660565 4749 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660569 4749 flags.go:64] FLAG: --image-service-endpoint="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660572 4749 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660577 4749 flags.go:64] FLAG: --kube-api-burst="100" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660581 4749 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660585 4749 flags.go:64] FLAG: --kube-api-qps="50" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660589 4749 flags.go:64] FLAG: --kube-reserved="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660593 4749 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660597 4749 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660608 4749 flags.go:64] FLAG: --kubelet-cgroups="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660612 4749 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660617 4749 flags.go:64] FLAG: --lock-file="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660621 4749 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660627 4749 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660632 4749 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660638 4749 flags.go:64] FLAG: --log-json-split-stream="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660643 4749 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660647 4749 flags.go:64] FLAG: --log-text-split-stream="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660652 4749 flags.go:64] FLAG: --logging-format="text" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 
18:35:32.660656 4749 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660660 4749 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660664 4749 flags.go:64] FLAG: --manifest-url="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660669 4749 flags.go:64] FLAG: --manifest-url-header="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660679 4749 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660683 4749 flags.go:64] FLAG: --max-open-files="1000000" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660688 4749 flags.go:64] FLAG: --max-pods="110" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660692 4749 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660696 4749 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660700 4749 flags.go:64] FLAG: --memory-manager-policy="None" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660704 4749 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660708 4749 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660712 4749 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660716 4749 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660726 4749 flags.go:64] FLAG: --node-status-max-images="50" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660730 4749 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660734 4749 flags.go:64] FLAG: --oom-score-adj="-999" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660739 4749 flags.go:64] FLAG: --pod-cidr="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660742 4749 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660749 4749 flags.go:64] FLAG: --pod-manifest-path="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660753 4749 flags.go:64] FLAG: --pod-max-pids="-1" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660757 4749 flags.go:64] FLAG: --pods-per-core="0" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660761 4749 flags.go:64] FLAG: --port="10250" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660765 4749 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660769 4749 flags.go:64] FLAG: --provider-id="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660773 4749 flags.go:64] FLAG: --qos-reserved="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660784 4749 flags.go:64] FLAG: --read-only-port="10255" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660789 4749 flags.go:64] FLAG: --register-node="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660793 4749 flags.go:64] FLAG: --register-schedulable="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 
18:35:32.660797 4749 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660804 4749 flags.go:64] FLAG: --registry-burst="10" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660808 4749 flags.go:64] FLAG: --registry-qps="5" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660812 4749 flags.go:64] FLAG: --reserved-cpus="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660815 4749 flags.go:64] FLAG: --reserved-memory="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660820 4749 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660824 4749 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660828 4749 flags.go:64] FLAG: --rotate-certificates="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660832 4749 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660836 4749 flags.go:64] FLAG: --runonce="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660840 4749 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660844 4749 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660849 4749 flags.go:64] FLAG: --seccomp-default="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660853 4749 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660857 4749 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660861 4749 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660865 4749 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660870 4749 flags.go:64] FLAG: --storage-driver-password="root" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660874 4749 flags.go:64] FLAG: --storage-driver-secure="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660878 4749 flags.go:64] FLAG: --storage-driver-table="stats" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660882 4749 flags.go:64] FLAG: --storage-driver-user="root" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660886 4749 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660890 4749 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660894 4749 flags.go:64] FLAG: --system-cgroups="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660899 4749 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660905 4749 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660909 4749 flags.go:64] FLAG: --tls-cert-file="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660912 4749 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660926 4749 flags.go:64] FLAG: --tls-min-version="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660930 4749 flags.go:64] FLAG: --tls-private-key-file="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 
18:35:32.660934 4749 flags.go:64] FLAG: --topology-manager-policy="none" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660938 4749 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660942 4749 flags.go:64] FLAG: --topology-manager-scope="container" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660952 4749 flags.go:64] FLAG: --v="2" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660958 4749 flags.go:64] FLAG: --version="false" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660963 4749 flags.go:64] FLAG: --vmodule="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660968 4749 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.660972 4749 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661119 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661124 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661128 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661133 4749 feature_gate.go:330] unrecognized feature gate: Example Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661137 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661140 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661144 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661147 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661151 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661154 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661158 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661161 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661165 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661168 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661172 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661177 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661182 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
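
The flags.go:64 "FLAG:" block above is the kubelet enumerating every command-line flag with its effective value once at startup. A minimal sketch of how a Go component can produce that kind of dump with the spf13/pflag package follows; the flag names and values are examples chosen to match lines above, not the kubelet's own registration code.

    package main

    import (
        "log"

        "github.com/spf13/pflag"
    )

    func main() {
        // Register a couple of example flags; the real kubelet registers many more.
        fs := pflag.NewFlagSet("kubelet-example", pflag.ContinueOnError)
        fs.String("node-ip", "", "IP address of the node")
        fs.Int32("max-pods", 110, "maximum number of pods per node")
        if err := fs.Parse([]string{"--node-ip=192.168.126.11"}); err != nil {
            log.Fatal(err)
        }
        // Emit one line per flag with its effective value, defaults included,
        // in the same spirit as the flags.go:64 "FLAG:" lines above.
        fs.VisitAll(func(f *pflag.Flag) {
            log.Printf("FLAG: --%s=%q", f.Name, f.Value.String())
        })
    }

Having the full flag set recorded once makes later lines easier to reconcile, for example --cgroup-driver="cgroupfs" here versus the cgroupDriver="systemd" the kubelet later reports receiving from the CRI runtime.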
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661187 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661192 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661197 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661201 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661205 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661210 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661214 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661218 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661221 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661225 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661228 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661232 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661235 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661238 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661244 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661248 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661253 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661257 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661260 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661264 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661267 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661272 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661276 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661279 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661282 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661286 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661290 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661293 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661296 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661300 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661304 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661307 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661312 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
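
The repeated feature_gate.go:330 warnings above and below record gate names (GatewayAPI, PinnedImages, InsightsConfig, and so on) that the kubelet does not recognize; these appear to be OpenShift cluster-level gates with no counterpart in the embedded Kubernetes gate table, so they are logged and skipped rather than applied. A dependency-free sketch of that pattern, purely illustrative and not the k8s.io/component-base implementation:

    package main

    import "log"

    // applyGates sets requested gates on a table of known gates and warns on
    // names it does not know about, mirroring the shape of the warnings above.
    func applyGates(known map[string]bool, requested map[string]bool) {
        for name, enabled := range requested {
            if _, ok := known[name]; !ok {
                log.Printf("unrecognized feature gate: %s", name)
                continue
            }
            known[name] = enabled
        }
    }

    func main() {
        known := map[string]bool{"CloudDualStackNodeIPs": false, "KMSv1": false}
        requested := map[string]bool{
            "CloudDualStackNodeIPs": true,
            "GatewayAPI":            true, // cluster-level gate unknown to this table
        }
        applyGates(known, requested)
        log.Printf("feature gates: %v", known)
    }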
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661316 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661320 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661341 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661345 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661350 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661354 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661357 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661360 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661364 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661368 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661371 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661375 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661379 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661382 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661386 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661389 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661393 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661398 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661401 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661405 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.661409 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.661415 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.669595 4749 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 28 18:35:32 crc 
kubenswrapper[4749]: I0128 18:35:32.669637 4749 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669698 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669706 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669714 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669718 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669722 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669726 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669731 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669736 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669741 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669745 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669749 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669753 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669757 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669761 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669766 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669770 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669775 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669779 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669783 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669786 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669790 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669793 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669797 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669800 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669804 4749 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669807 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669811 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669814 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669817 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669821 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669824 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669828 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669831 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669835 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669839 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669842 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669846 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669849 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669853 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669856 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669860 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669863 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669867 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669871 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669875 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669878 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669882 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669885 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669889 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669892 4749 feature_gate.go:330] unrecognized feature gate: 
MachineConfigNodes Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669896 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669900 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669903 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669907 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669910 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669914 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669917 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669921 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669924 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669928 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669931 4749 feature_gate.go:330] unrecognized feature gate: Example Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669935 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669938 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669943 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669948 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669952 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669957 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669961 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669965 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669969 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.669973 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.669980 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670081 4749 feature_gate.go:330] unrecognized feature gate: Example Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670089 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670095 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670099 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670104 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670108 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670111 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670115 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670120 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
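
Each pass over the gate configuration ends with an I-level "feature gates: {map[...]}" summary like the one above, listing the effective values of the upstream gates. A hypothetical helper for turning that summary back into a map when comparing kubelet restarts; the parsing rules are ours and rely only on the whitespace-separated Name:bool pairs visible in the log line.

    package main

    import (
        "fmt"
        "strings"
    )

    // parseGateSummary converts the textual "{map[Name:true Other:false]}" form
    // from the kubelet log into a Go map.
    func parseGateSummary(s string) map[string]bool {
        out := map[string]bool{}
        s = strings.TrimSuffix(strings.TrimPrefix(s, "{map["), "]}")
        for _, kv := range strings.Fields(s) {
            if name, val, ok := strings.Cut(kv, ":"); ok {
                out[name] = val == "true"
            }
        }
        return out
    }

    func main() {
        summary := "{map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}"
        fmt.Println(parseGateSummary(summary))
    }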
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670124 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670128 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670132 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670136 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670139 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670144 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670147 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670151 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670155 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670160 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670163 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670167 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670171 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670174 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670178 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670182 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670185 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670189 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670193 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670196 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670200 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670203 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670207 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670210 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670214 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670217 4749 feature_gate.go:330] unrecognized feature 
gate: InsightsConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670221 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670224 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670228 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670231 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670235 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670239 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670245 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670249 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670253 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670257 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670261 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670264 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670268 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670272 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
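
Every kubenswrapper line in this log carries the standard klog text header: a severity letter (I/W/E/F), MMDD date, wall-clock time, PID, and source file:line, followed by the message. A hypothetical helper for splitting that header apart when post-processing this artifact; the regular expression is our own approximation of the format, not code from klog itself.

    package main

    import (
        "fmt"
        "regexp"
    )

    // klogHeader approximates the klog text format used by the kubelet lines in
    // this log: severity letter, MMDD, time, PID, then source file:line.
    var klogHeader = regexp.MustCompile(
        `^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([^\]]+)\] (.*)$`)

    func main() {
        line := `W0128 18:35:32.661253 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.`
        m := klogHeader.FindStringSubmatch(line)
        if m == nil {
            fmt.Println("not a klog-formatted line")
            return
        }
        fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmsg=%q\n",
            m[1], m[2], m[3], m[4], m[5], m[6])
    }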
Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670277 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670280 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670284 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670288 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670292 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670295 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670299 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670302 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670306 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670309 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670313 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670316 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670320 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670344 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670347 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670351 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670354 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670358 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670362 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670365 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670368 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.670372 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.670378 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.673650 4749 server.go:940] "Client rotation is on, will bootstrap in background" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.684394 4749 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.684506 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.685900 4749 server.go:997] "Starting client certificate rotation" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.685933 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.686150 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-08 21:34:34.066816463 +0000 UTC Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.686275 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.711904 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.715472 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.716359 4749 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.736359 4749 log.go:25] "Validated CRI v1 runtime API" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.773781 4749 log.go:25] "Validated CRI v1 image API" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.775225 4749 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.778981 4749 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-28-18-30-42-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.779070 4749 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.793721 4749 manager.go:217] Machine: {Timestamp:2026-01-28 18:35:32.790532139 +0000 UTC m=+0.802058924 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 
MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8a21bb4d-a47c-473a-90f1-3a8d36045974 BootID:67fbc4d2-6dc5-4849-8697-e9f75acd0da5 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:67:7f:1c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:67:7f:1c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:68:68:e8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:49:17:33 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d0:e0:b6 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e6:ca:2a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:46:2d:d0:82:5c:7b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7e:be:ac:fa:08:84 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: 
DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.793951 4749 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.794098 4749 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.796902 4749 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.797069 4749 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.797101 4749 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.797271 4749 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.797279 4749 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.799173 4749 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.799202 4749 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.799461 4749 state_mem.go:36] "Initialized new in-memory state store" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.799541 4749 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.804361 4749 kubelet.go:418] "Attempting to sync node with API server" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.804383 4749 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.804409 4749 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.804422 4749 kubelet.go:324] "Adding apiserver pod source" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.804434 4749 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.808608 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.808669 4749 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.808656 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.808812 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.810245 4749 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.811174 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.812388 4749 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.813928 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.813950 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.813957 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.813963 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.813973 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.813980 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.813986 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.813997 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.814003 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.814009 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.814030 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.814872 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.815897 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.816356 
4749 server.go:1280] "Started kubelet" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.817556 4749 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.817534 4749 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.818206 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.818564 4749 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 28 18:35:32 crc systemd[1]: Started Kubernetes Kubelet. Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.821052 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.821755 4749 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.821829 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 04:31:42.86255704 +0000 UTC Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.821777 4749 server.go:460] "Adding debug handlers to kubelet server" Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.822484 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.822944 4749 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.823000 4749 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.823171 4749 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.823581 4749 factory.go:55] Registering systemd factory Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.823630 4749 factory.go:221] Registration of the systemd container factory successfully Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.823679 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.823902 4749 factory.go:153] Registering CRI-O factory Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.824010 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.824030 4749 factory.go:221] Registration of the crio container factory successfully Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.824610 4749 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial 
containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.824696 4749 factory.go:103] Registering Raw factory Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.824716 4749 manager.go:1196] Started watching for new ooms in manager Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.825722 4749 manager.go:319] Starting recovery of all containers Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.825768 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="200ms" Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.825892 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188ef8dd440c8052 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 18:35:32.81631445 +0000 UTC m=+0.827841245,LastTimestamp:2026-01-28 18:35:32.81631445 +0000 UTC m=+0.827841245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.832923 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.832985 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.832996 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833005 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833015 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833024 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" 
seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833034 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833043 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833056 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833081 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833090 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833102 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833111 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833121 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833143 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833152 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833162 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 28 18:35:32 crc 
kubenswrapper[4749]: I0128 18:35:32.833173 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833184 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833194 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833224 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833236 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833244 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833252 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833264 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833274 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833286 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833306 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833316 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833348 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833379 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833391 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833403 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.833416 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.836986 4749 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837256 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837278 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837292 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837305 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837319 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837351 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837366 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837382 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837403 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837417 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837431 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837444 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837458 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837471 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837484 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" 
seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837503 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837516 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837530 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837550 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837565 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837581 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837597 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837612 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837625 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837639 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837655 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 28 
18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837671 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837695 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837707 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837721 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837733 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837746 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837764 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837777 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837791 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837804 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837817 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837832 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837847 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837861 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837884 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837898 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837912 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837928 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837942 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837957 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837971 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837984 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.837998 4749 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838009 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838021 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838030 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838046 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838057 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838067 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838076 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838088 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838101 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838119 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838134 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838146 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838157 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838167 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838183 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838197 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838210 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838224 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838238 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838252 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838265 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838285 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838306 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838344 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838361 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838374 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838578 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838599 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838611 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838625 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838639 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838651 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838664 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838676 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838689 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838702 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838716 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838730 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838743 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838757 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838770 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838786 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838800 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838813 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838823 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838835 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838849 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838864 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838878 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838891 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838905 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838917 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838929 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838940 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838956 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838970 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838984 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.838996 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839008 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839021 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839253 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839268 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839281 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839293 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839305 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839318 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839375 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839388 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839401 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839412 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839424 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839437 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839449 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839465 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839478 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839490 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839502 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839514 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839527 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839540 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839551 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839563 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839578 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839590 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839603 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839616 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839630 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839644 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839657 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839670 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839681 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839695 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839707 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839719 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839733 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839745 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839756 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839767 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839778 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839787 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839797 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839805 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839814 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839825 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839834 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839847 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839860 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839873 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839885 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839895 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839905 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839917 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839930 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839943 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839956 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839968 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839981 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.839994 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.840007 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.840023 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.840036 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.840048 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.840061 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.840074 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.840086 4749 reconstruct.go:97] "Volume reconstruction finished" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.840097 4749 reconciler.go:26] "Reconciler: start to sync state" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.847809 4749 manager.go:324] Recovery completed Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.857959 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.860104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.860242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.860346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.861273 4749 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.861292 4749 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.861311 4749 state_mem.go:36] "Initialized new in-memory state store" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.868344 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.870104 4749 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.870142 4749 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.870166 4749 kubelet.go:2335] "Starting kubelet main sync loop" Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.870221 4749 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 18:35:32 crc kubenswrapper[4749]: W0128 18:35:32.873589 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.873654 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.880575 4749 policy_none.go:49] "None policy: Start" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.881415 4749 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.881517 4749 state_mem.go:35] "Initializing new in-memory state store" Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.923653 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.937065 4749 manager.go:334] "Starting Device Plugin manager" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.937104 4749 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.937116 4749 server.go:79] "Starting device plugin registration server" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.937590 4749 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.937602 4749 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.937778 4749 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.937846 4749 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.937857 4749 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 28 18:35:32 crc kubenswrapper[4749]: E0128 18:35:32.943373 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.971339 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Jan 28 18:35:32 crc kubenswrapper[4749]: 
I0128 18:35:32.971493 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.975115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.975242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.975252 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.975427 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.975597 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.975647 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976509 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976623 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976658 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.976791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977609 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977761 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.977798 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978358 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978399 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978422 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.978452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979074 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979240 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979263 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:32 crc kubenswrapper[4749]: I0128 18:35:32.979810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:33 crc kubenswrapper[4749]: E0128 18:35:33.026616 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="400ms" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.037836 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.038821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.038860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.038870 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.038895 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 18:35:33 crc kubenswrapper[4749]: E0128 18:35:33.039423 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043672 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043689 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043949 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.043989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.044021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.044042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.044091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.044115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.044139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145496 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145511 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145544 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145541 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145638 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145597 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145738 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.145936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.240502 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.241819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.241873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.241883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.241910 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 18:35:33 crc kubenswrapper[4749]: E0128 18:35:33.242422 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.314255 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.339026 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: W0128 18:35:33.354667 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5e65dc445d223305d80984fc025b131c9bf442556d029d51239f6008707397d6 WatchSource:0}: Error finding container 5e65dc445d223305d80984fc025b131c9bf442556d029d51239f6008707397d6: Status 404 returned error can't find the container with id 5e65dc445d223305d80984fc025b131c9bf442556d029d51239f6008707397d6 Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.358052 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.366183 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.371318 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 28 18:35:33 crc kubenswrapper[4749]: W0128 18:35:33.374093 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-57758fe9428627c44f2a0e72f993904e12aac18887af112145dfd7b4c950c080 WatchSource:0}: Error finding container 57758fe9428627c44f2a0e72f993904e12aac18887af112145dfd7b4c950c080: Status 404 returned error can't find the container with id 57758fe9428627c44f2a0e72f993904e12aac18887af112145dfd7b4c950c080 Jan 28 18:35:33 crc kubenswrapper[4749]: W0128 18:35:33.385423 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a2b5d2fc45bcb7b4f40df5dfd7d830b99310f49b281c4327df785ac02f0216de WatchSource:0}: Error finding container a2b5d2fc45bcb7b4f40df5dfd7d830b99310f49b281c4327df785ac02f0216de: Status 404 returned error can't find the container with id a2b5d2fc45bcb7b4f40df5dfd7d830b99310f49b281c4327df785ac02f0216de Jan 28 18:35:33 crc kubenswrapper[4749]: E0128 18:35:33.427918 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="800ms" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.643299 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.644424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.644461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.644477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.644503 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 18:35:33 crc kubenswrapper[4749]: E0128 18:35:33.644921 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.819630 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.822743 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 23:19:40.980305339 +0000 UTC Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.874200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57758fe9428627c44f2a0e72f993904e12aac18887af112145dfd7b4c950c080"} Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.875183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e65dc445d223305d80984fc025b131c9bf442556d029d51239f6008707397d6"} Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.876507 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a2b5d2fc45bcb7b4f40df5dfd7d830b99310f49b281c4327df785ac02f0216de"} Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.877473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"58eb4241cb826404426a5a4d23ba7de7f3f2b8306446a7fa62b71e39a26b220a"} Jan 28 18:35:33 crc kubenswrapper[4749]: I0128 18:35:33.878443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6746bb985b6af52844e2cb0f2553765161d8c8ddf62817f8e52131db27d99f4a"} Jan 28 18:35:33 crc kubenswrapper[4749]: W0128 18:35:33.959796 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:33 crc kubenswrapper[4749]: E0128 18:35:33.959906 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:34 crc kubenswrapper[4749]: W0128 18:35:34.220238 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:34 crc kubenswrapper[4749]: E0128 18:35:34.220589 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:34 crc kubenswrapper[4749]: E0128 18:35:34.228204 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="1.6s" Jan 28 18:35:34 crc kubenswrapper[4749]: W0128 18:35:34.277771 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:34 crc 
kubenswrapper[4749]: E0128 18:35:34.277890 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:34 crc kubenswrapper[4749]: W0128 18:35:34.389545 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:34 crc kubenswrapper[4749]: E0128 18:35:34.389700 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.445535 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.447207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.447239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.447248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.447268 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 18:35:34 crc kubenswrapper[4749]: E0128 18:35:34.447856 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.819853 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.823116 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 19:15:47.192484131 +0000 UTC Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.883691 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e" exitCode=0 Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.883767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e"} Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.883876 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.885019 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.885065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.885083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.885888 4749 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="40b88793b9913d601f0549f3af57d50dfa86a81ed03aa26a00e422feeb6f28b1" exitCode=0 Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.885959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"40b88793b9913d601f0549f3af57d50dfa86a81ed03aa26a00e422feeb6f28b1"} Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.885973 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.886764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.886791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.886804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.887622 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="369c6b3c9348ac2a735d1139977c522aed7fb0943e3dc89906c89192f4b167ea" exitCode=0 Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.887674 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"369c6b3c9348ac2a735d1139977c522aed7fb0943e3dc89906c89192f4b167ea"} Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.887706 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.888186 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.888376 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.888398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.888409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.888786 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="79921c2fff60bfcf1cbff244afcdabaaa40fd66e66961d593cfb7080a98db25a" exitCode=0 Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.888843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"79921c2fff60bfcf1cbff244afcdabaaa40fd66e66961d593cfb7080a98db25a"} Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.888970 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.889293 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.889344 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.889354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.890002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.890021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.890032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.891869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ff819de1319ae7769ef146eed6d66f96b0b994f8dc3ee23caec83aed9d1d3c72"} Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.891906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d5ceee13b1e49d67f742fe7b004ab801db448db4b9faa57be817dee3e352f6d"} Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.891920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79fa232e65f44d21af327147ba2791e02af9b3bb94baf21e2836eddfd45f1fbc"} Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.891932 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6dcb8a332b44be3152800d1d89f6a56f6d0851de934ab0085a81a1eaa2afc002"} Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.891968 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.892458 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.893499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.894032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:34 crc kubenswrapper[4749]: I0128 18:35:34.894122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:34 crc 
kubenswrapper[4749]: E0128 18:35:34.894407 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.819403 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.823850 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 09:35:29.447030604 +0000 UTC Jan 28 18:35:35 crc kubenswrapper[4749]: E0128 18:35:35.829817 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="3.2s" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.896375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"508a64c5e44c28845ea36f13a310c543d25f17c1440d5a65d36b16e784a0eb16"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.896400 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.897118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.897157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.897176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.898451 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1c1e839c8db147191a1d94e5dc2f7b592921a0c31c0676aa0c8be6958572ee97" exitCode=0 Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.898548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1c1e839c8db147191a1d94e5dc2f7b592921a0c31c0676aa0c8be6958572ee97"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.898617 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.899770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.899811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.899824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 
28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.902087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3c1cf5c4057922414378daee981248934287870912fca5f91299a7c928d5e1e4"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.902147 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.902152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"000fa3419833066d62c4ad1264c085fb8c1a3712254a2b328fe9766c2365b658"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.902234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2043b8d35f4b0496c7ebab4af2a21d86aa5cb037d1e0dc70d00fd922ce820f40"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.903624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.903661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.903672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.905352 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.905792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.905817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.905828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.905836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71"} Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.906136 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.906156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:35 crc kubenswrapper[4749]: I0128 18:35:35.906165 4749 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:35 crc kubenswrapper[4749]: W0128 18:35:35.982611 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Jan 28 18:35:35 crc kubenswrapper[4749]: E0128 18:35:35.982699 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.048238 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.049255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.049292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.049303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.049343 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 18:35:36 crc kubenswrapper[4749]: E0128 18:35:36.054562 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.183981 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.418200 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.423849 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.824713 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 13:01:32.486699266 +0000 UTC Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.911696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136"} Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.911767 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.912539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.912576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.912589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914052 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="69fbc05c67aff2d4b39509086f73224f312962afe1d3405c5af48160231a4b4d" exitCode=0 Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"69fbc05c67aff2d4b39509086f73224f312962afe1d3405c5af48160231a4b4d"} Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914130 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914187 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914215 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914228 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914343 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.914978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915362 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:36 crc kubenswrapper[4749]: I0128 18:35:36.915649 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.149702 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.206788 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.825858 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 23:23:19.515679816 +0000 UTC Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"96e7926fe9e366cbc639340ca73c269c6bd69f60bf5caa7720143530eb22c8fb"} Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921324 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c55acaaff41da69137bfbc1cbc735af304ff4be2bb1847d95455f9c0bf178e1e"} Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921407 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0b96446078980d4d9ba0f2ccd5614743a10c0df8cbb3bb0505551ec226f6d296"} Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a024db850ab658cd708b53e796414299ab9e88a96c28b7b7710e2c4ce5539c7e"} Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7904849594ffeac9d8c2a598603820d762446f7efdb2d924fd1196bea9d6e5d8"} Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921528 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921582 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.921541 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922593 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922654 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:37 crc kubenswrapper[4749]: I0128 18:35:37.922855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.615500 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.826978 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 01:14:02.937451852 +0000 UTC Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.924680 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.924793 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.924793 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.925817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.925850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.925861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.926352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.926390 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.926421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 
28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.927195 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.927220 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:38 crc kubenswrapper[4749]: I0128 18:35:38.927231 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.112422 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.255607 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.256969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.257225 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.257285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.257343 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.827936 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 11:11:01.466941817 +0000 UTC Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.891462 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.927291 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.927390 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.928684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.928761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.928777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.928708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.928965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:39 crc kubenswrapper[4749]: I0128 18:35:39.929046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.116888 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.117171 4749 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.118666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.118715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.118732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.207375 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.207480 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.335891 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.828623 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 23:12:41.244602359 +0000 UTC Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.930160 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.931182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.931232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:40 crc kubenswrapper[4749]: I0128 18:35:40.931248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:41 crc kubenswrapper[4749]: I0128 18:35:41.829370 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 11:04:42.880071993 +0000 UTC Jan 28 18:35:42 crc kubenswrapper[4749]: I0128 18:35:42.829626 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 20:10:47.695507183 +0000 UTC Jan 28 18:35:42 crc kubenswrapper[4749]: E0128 18:35:42.943483 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 28 18:35:43 crc kubenswrapper[4749]: I0128 18:35:43.830171 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 06:33:26.181147325 +0000 UTC Jan 28 18:35:44 crc kubenswrapper[4749]: I0128 18:35:44.372219 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:44 crc kubenswrapper[4749]: I0128 18:35:44.372320 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:44 crc kubenswrapper[4749]: I0128 18:35:44.373479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:44 crc kubenswrapper[4749]: I0128 18:35:44.373506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:44 crc kubenswrapper[4749]: I0128 18:35:44.373513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:44 crc kubenswrapper[4749]: I0128 18:35:44.830705 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 23:44:14.351126188 +0000 UTC Jan 28 18:35:45 crc kubenswrapper[4749]: I0128 18:35:45.831682 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 14:44:27.030632304 +0000 UTC Jan 28 18:35:46 crc kubenswrapper[4749]: W0128 18:35:46.625831 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.625936 4749 trace.go:236] Trace[1716046010]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 18:35:36.624) (total time: 10001ms): Jan 28 18:35:46 crc kubenswrapper[4749]: Trace[1716046010]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:35:46.625) Jan 28 18:35:46 crc kubenswrapper[4749]: Trace[1716046010]: [10.001870964s] [10.001870964s] END Jan 28 18:35:46 crc kubenswrapper[4749]: E0128 18:35:46.625962 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.820392 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.832654 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 21:14:34.625732515 +0000 UTC Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.857053 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 28 18:35:46 crc 
kubenswrapper[4749]: I0128 18:35:46.857127 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.861273 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.861349 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.945143 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.946867 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136" exitCode=255 Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.946905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136"} Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.947054 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.947839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.947883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.947893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:46 crc kubenswrapper[4749]: I0128 18:35:46.948488 4749 scope.go:117] "RemoveContainer" containerID="16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136" Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.157290 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]log ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]etcd ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 28 18:35:47 crc 
kubenswrapper[4749]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/generic-apiserver-start-informers ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/priority-and-fairness-filter ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-apiextensions-informers ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-apiextensions-controllers ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/crd-informer-synced ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-system-namespaces-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 28 18:35:47 crc kubenswrapper[4749]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 28 18:35:47 crc kubenswrapper[4749]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/bootstrap-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/start-kube-aggregator-informers ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/apiservice-registration-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/apiservice-discovery-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]autoregister-completion ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/apiservice-openapi-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 28 18:35:47 crc kubenswrapper[4749]: livez check failed Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.157424 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.541835 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.833400 4749 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:31:22.699273193 +0000 UTC Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.952034 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.953586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569"} Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.953740 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.954634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.954671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:47 crc kubenswrapper[4749]: I0128 18:35:47.954686 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:48 crc kubenswrapper[4749]: I0128 18:35:48.833992 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:41:50.663787614 +0000 UTC Jan 28 18:35:48 crc kubenswrapper[4749]: I0128 18:35:48.955152 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:48 crc kubenswrapper[4749]: I0128 18:35:48.955472 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:48 crc kubenswrapper[4749]: I0128 18:35:48.955917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:48 crc kubenswrapper[4749]: I0128 18:35:48.955969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:48 crc kubenswrapper[4749]: I0128 18:35:48.955978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.834673 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 03:53:14.536353198 +0000 UTC Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.920258 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.920553 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.921838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.921886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.921903 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.935886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.958193 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.958492 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.959997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.960024 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.960051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.960068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.960030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:49 crc kubenswrapper[4749]: I0128 18:35:49.960175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:50 crc kubenswrapper[4749]: I0128 18:35:50.208245 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 18:35:50 crc kubenswrapper[4749]: I0128 18:35:50.208377 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 18:35:50 crc kubenswrapper[4749]: I0128 18:35:50.834816 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 12:24:28.946581118 +0000 UTC Jan 28 18:35:50 crc kubenswrapper[4749]: I0128 18:35:50.934265 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.835368 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 02:25:13.223479244 +0000 UTC Jan 28 18:35:51 crc kubenswrapper[4749]: E0128 18:35:51.858638 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.860166 4749 trace.go:236] Trace[1504499476]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 18:35:36.917) (total time: 14942ms): Jan 28 18:35:51 crc kubenswrapper[4749]: Trace[1504499476]: ---"Objects listed" error: 14942ms (18:35:51.860) Jan 28 18:35:51 crc kubenswrapper[4749]: Trace[1504499476]: [14.942235569s] [14.942235569s] END Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.860195 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.860770 4749 trace.go:236] Trace[974130561]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 18:35:36.969) (total time: 14891ms): Jan 28 18:35:51 crc kubenswrapper[4749]: Trace[974130561]: ---"Objects listed" error: 14891ms (18:35:51.860) Jan 28 18:35:51 crc kubenswrapper[4749]: Trace[974130561]: [14.891653177s] [14.891653177s] END Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.860796 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.862028 4749 trace.go:236] Trace[834333899]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 18:35:39.781) (total time: 12080ms): Jan 28 18:35:51 crc kubenswrapper[4749]: Trace[834333899]: ---"Objects listed" error: 12079ms (18:35:51.861) Jan 28 18:35:51 crc kubenswrapper[4749]: Trace[834333899]: [12.080115081s] [12.080115081s] END Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.862062 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.862250 4749 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.865165 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.866933 4749 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.867250 4749 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.868533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.868584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.868596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.868620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.868633 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:51Z","lastTransitionTime":"2026-01-28T18:35:51Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:51 crc kubenswrapper[4749]: E0128 18:35:51.882007 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67fbc4d2-6dc5-4849-8697-e9f75acd0da5\\\",\\\"systemUUID\\\":\\\"8a21bb4d-a47c-473a-90f1-3a8d36045974\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.886243 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.886291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 
18:35:51.886305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.886342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.886356 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:51Z","lastTransitionTime":"2026-01-28T18:35:51Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:51 crc kubenswrapper[4749]: E0128 18:35:51.894581 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67fbc4d2-6dc5-4849-8697-e9f75acd0da5\\\",\\\"systemUUID\\\":\\\"8a21bb4d-a47c-473a-90f1-3a8d36045974\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.899160 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.899479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 
18:35:51.899503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.899525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.899550 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:51Z","lastTransitionTime":"2026-01-28T18:35:51Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:51 crc kubenswrapper[4749]: E0128 18:35:51.910991 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67fbc4d2-6dc5-4849-8697-e9f75acd0da5\\\",\\\"systemUUID\\\":\\\"8a21bb4d-a47c-473a-90f1-3a8d36045974\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.914448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.914667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 
18:35:51.914755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.914850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.914916 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:51Z","lastTransitionTime":"2026-01-28T18:35:51Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:51 crc kubenswrapper[4749]: E0128 18:35:51.925486 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67fbc4d2-6dc5-4849-8697-e9f75acd0da5\\\",\\\"systemUUID\\\":\\\"8a21bb4d-a47c-473a-90f1-3a8d36045974\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.929454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.929501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 
18:35:51.929514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.929536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.929548 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:51Z","lastTransitionTime":"2026-01-28T18:35:51Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:51 crc kubenswrapper[4749]: E0128 18:35:51.941858 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:51Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"67fbc4d2-6dc5-4849-8697-e9f75acd0da5\\\",\\\"systemUUID\\\":\\\"8a21bb4d-a47c-473a-90f1-3a8d36045974\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:51 crc kubenswrapper[4749]: E0128 18:35:51.942067 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.943640 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 
18:35:51.943676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.943688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.943709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:51 crc kubenswrapper[4749]: I0128 18:35:51.943722 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:51Z","lastTransitionTime":"2026-01-28T18:35:51Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.046346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.046384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.046397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.046418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.046430 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.148219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.148260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.148271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.148289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.148298 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.153678 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.158523 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.250966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.251000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.251008 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.251025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.251036 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.353296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.353355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.353368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.353389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.353402 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.456129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.456171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.456183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.456203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.456217 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.558315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.558375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.558386 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.558406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.558417 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.660526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.660559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.660568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.660583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.660593 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.762746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.762788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.762799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.762817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.762826 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.817240 4749 apiserver.go:52] "Watching apiserver" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.819971 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.820248 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.820634 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.820689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.820745 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.820772 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.820777 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.820860 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.820909 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.821373 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.821289 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.822770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.823060 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.824146 4749 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.824636 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.825054 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.825091 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.826641 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.826647 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.826770 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.826867 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.835484 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 09:30:37.774084179 +0000 UTC Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.847360 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.860646 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d46ae1a-7005-4413-8fd3-5c7ef768cefd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T18:35:46Z\\\",\\\"message\\\":\\\"W0128 18:35:35.998870 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0128 
18:35:35.999282 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769625335 cert, and key in /tmp/serving-cert-2868254237/serving-signer.crt, /tmp/serving-cert-2868254237/serving-signer.key\\\\nI0128 18:35:36.331173 1 observer_polling.go:159] Starting file observer\\\\nW0128 18:35:36.338187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0128 18:35:36.338386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 18:35:36.339841 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2868254237/tls.crt::/tmp/serving-cert-2868254237/tls.key\\\\\\\"\\\\nF0128 18:35:46.558913 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T18:35:32Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.865112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.865143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.865152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.865167 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.865176 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868288 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868350 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868397 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868418 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868462 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868487 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868534 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868576 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868599 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868619 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868641 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868742 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868737 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868764 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868830 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.868983 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869121 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" 
(UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869149 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869183 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869230 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869259 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869274 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869290 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") 
pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869473 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869469 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869489 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869563 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869594 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869611 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869709 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869733 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869750 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869767 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869764 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869801 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869834 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869884 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.869998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870114 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870163 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870230 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870365 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870386 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870426 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870448 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870561 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870608 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 
18:35:52.870653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870799 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870898 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870924 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870949 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871065 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871089 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871132 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871154 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871200 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871223 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871245 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871290 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871313 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871388 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871414 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871675 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871749 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871773 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871821 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871846 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871871 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871895 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871918 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872003 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872109 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872209 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872233 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872309 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872353 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872380 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 18:35:52 crc 
kubenswrapper[4749]: I0128 18:35:52.870043 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872509 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870255 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870281 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872565 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872640 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872664 
4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872737 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872842 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872866 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872891 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " 
Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872939 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872960 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873030 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873052 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873212 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873235 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873276 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873795 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873812 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873839 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873891 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873910 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 
18:35:52.874030 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874220 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874312 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874345 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874363 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874376 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874388 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874401 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874414 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874427 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874441 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" 
DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874453 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874467 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874481 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874495 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.875357 4749 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.886653 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d46ae1a-7005-4413-8fd3-5c7ef768cefd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T18:35:46Z\\\",\\\"message\\\":\\\"W0128 18:35:35.998870 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0128 
18:35:35.999282 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769625335 cert, and key in /tmp/serving-cert-2868254237/serving-signer.crt, /tmp/serving-cert-2868254237/serving-signer.key\\\\nI0128 18:35:36.331173 1 observer_polling.go:159] Starting file observer\\\\nW0128 18:35:36.338187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0128 18:35:36.338386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 18:35:36.339841 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2868254237/tls.crt::/tmp/serving-cert-2868254237/tls.key\\\\\\\"\\\\nF0128 18:35:46.558913 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T18:35:32Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870563 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870840 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870841 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893724 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.870967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871981 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.871998 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872409 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872434 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872489 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872574 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.872960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893798 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873822 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873843 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873854 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874876 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874946 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.874991 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.875019 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.875279 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.875492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.875606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.875929 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893878 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.879467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.879787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.879821 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.879879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.880515 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.880567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.886535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.886565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.887163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.887280 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.887511 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.887556 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.887688 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.887767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888209 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888534 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888603 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888641 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888701 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893996 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888835 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.888902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.889186 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.889337 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.889493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.889679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.889790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.889930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.890100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.890586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.890620 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.890788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.890858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.891023 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.891057 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.891212 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.891311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.891880 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892169 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892175 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892554 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892706 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.892983 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893028 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893185 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893367 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893369 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893507 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893614 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893740 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.873352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.876005 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.894231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.894682 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.894712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.895668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.895675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.895922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896018 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.891915 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896138 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896192 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896448 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896693 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896865 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893266 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.896995 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897012 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893640 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897054 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897076 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897085 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897346 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897427 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897461 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897470 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897569 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.895383 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897814 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897836 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.897957 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898022 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.895557 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.893908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898269 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898497 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.898658 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.898723 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:53.398707906 +0000 UTC m=+21.410234681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898737 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898870 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.898921 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899143 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899194 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899205 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899207 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899609 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899627 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899648 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899854 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.899653 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.900203 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.900318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.900495 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.900571 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.900696 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.900858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.901405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.901477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.901488 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.901951 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.905071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.905139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.905163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.906073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.906104 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.906521 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.907042 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.907494 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.907530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.907644 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.907817 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:53.407802223 +0000 UTC m=+21.419329078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.907924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.908080 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.908307 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:35:53.408296665 +0000 UTC m=+21.419823560 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.908680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.908913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.909954 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.913133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.924446 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.924603 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.924616 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.924628 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.924679 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:53.424664684 +0000 UTC m=+21.436191459 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.927007 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.927030 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.927041 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.927082 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:53.427070103 +0000 UTC m=+21.438596878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.927221 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.934095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.935437 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.938685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.946562 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.948535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.957735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.958220 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.966620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.966668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.966677 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.966691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.966703 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:52Z","lastTransitionTime":"2026-01-28T18:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.969130 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: E0128 18:35:52.970497 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.975621 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 
18:35:52.976082 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976207 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976220 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976230 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976238 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976247 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976256 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976264 4749 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976271 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976280 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976290 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976298 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976306 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976313 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976339 4749 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976352 4749 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976379 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976429 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976443 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976453 4749 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976462 4749 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976471 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976480 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976489 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976515 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976523 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976531 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976539 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976547 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976555 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976563 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976571 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976579 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976589 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976597 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976607 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976619 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976629 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976638 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976647 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976658 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976668 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976678 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976694 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976705 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976713 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976722 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976731 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976740 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976747 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976756 4749 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976765 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976775 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976782 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976790 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976799 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976806 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976814 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976822 4749 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976830 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976839 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976847 4749 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976859 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 
18:35:52.976868 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976877 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976885 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976894 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976902 4749 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976911 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976919 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976974 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976983 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.976991 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977000 4749 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977008 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977017 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 
18:35:52.977009 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977026 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977054 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977066 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977074 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977083 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977092 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977101 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977110 4749 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977118 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977127 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977136 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977144 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977154 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977162 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977170 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977178 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977188 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977197 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977208 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977215 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977223 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977231 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977239 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977247 4749 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977255 4749 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977264 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977272 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977280 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977288 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977296 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977305 4749 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977314 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977339 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977348 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977356 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977365 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977374 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977382 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977391 4749 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977399 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977407 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977416 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977424 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977433 4749 reconciler_common.go:293] "Volume detached for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977441 4749 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977450 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977459 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977467 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977475 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977483 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977493 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977500 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977509 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977517 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977524 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977533 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977541 4749 
reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977549 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977556 4749 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977564 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977571 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977579 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977588 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977597 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977605 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977620 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977628 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977636 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977643 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977651 4749 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977659 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977667 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977675 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977683 4749 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977691 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977698 4749 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977706 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977714 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977721 4749 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977729 4749 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977737 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977744 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 
18:35:52.977754 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977762 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977770 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977777 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977784 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977792 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977799 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977807 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977818 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977825 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977833 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977841 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977848 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977856 4749 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977864 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977871 4749 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977878 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977886 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977894 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977902 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977910 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977918 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977928 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.977938 4749 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.988165 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:52 crc kubenswrapper[4749]: I0128 18:35:52.996967 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.006421 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.013791 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.024066 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.032746 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.040637 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.050244 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d46ae1a-7005-4413-8fd3-5c7ef768cefd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T18:35:46Z\\\",\\\"message\\\":\\\"W0128 18:35:35.998870 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0128 
18:35:35.999282 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769625335 cert, and key in /tmp/serving-cert-2868254237/serving-signer.crt, /tmp/serving-cert-2868254237/serving-signer.key\\\\nI0128 18:35:36.331173 1 observer_polling.go:159] Starting file observer\\\\nW0128 18:35:36.338187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0128 18:35:36.338386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 18:35:36.339841 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2868254237/tls.crt::/tmp/serving-cert-2868254237/tls.key\\\\\\\"\\\\nF0128 18:35:46.558913 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T18:35:32Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.069144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.069183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.069195 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.069211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.069223 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:53Z","lastTransitionTime":"2026-01-28T18:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.134829 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.148113 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.148156 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 18:35:53 crc kubenswrapper[4749]: W0128 18:35:53.149562 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-5b051979f6446604f145e18e6e52a4c9ba2c2854e2411fd43e5c5073d8d43c0b WatchSource:0}: Error finding container 5b051979f6446604f145e18e6e52a4c9ba2c2854e2411fd43e5c5073d8d43c0b: Status 404 returned error can't find the container with id 5b051979f6446604f145e18e6e52a4c9ba2c2854e2411fd43e5c5073d8d43c0b Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.170949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.170991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.171002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.171017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.171029 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:53Z","lastTransitionTime":"2026-01-28T18:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.273174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.273228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.273238 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.273253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.273262 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:53Z","lastTransitionTime":"2026-01-28T18:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.375378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.375428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.375436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.375451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.375459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:53Z","lastTransitionTime":"2026-01-28T18:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.477777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.477818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.477828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.477842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.477851 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:53Z","lastTransitionTime":"2026-01-28T18:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.482032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.482104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.482127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.482146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.482164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482256 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482269 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482279 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482314 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:54.482302836 +0000 UTC m=+22.493829601 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482379 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:35:54.482372458 +0000 UTC m=+22.493899233 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482418 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482439 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:54.482433819 +0000 UTC m=+22.493960594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482464 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482480 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:54.48247554 +0000 UTC m=+22.494002315 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482593 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482653 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482667 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.482729 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:54.482711886 +0000 UTC m=+22.494238661 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.580400 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.580444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.580454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.580470 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.580479 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:53Z","lastTransitionTime":"2026-01-28T18:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.682572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.682614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.682626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.682641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.682653 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:53Z","lastTransitionTime":"2026-01-28T18:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.954682 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.954881 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.955141 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 01:35:16.520255379 +0000 UTC Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.956520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.956550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.956561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.956579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.956592 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:53Z","lastTransitionTime":"2026-01-28T18:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.961505 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:53 crc kubenswrapper[4749]: E0128 18:35:53.961659 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.970932 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5b051979f6446604f145e18e6e52a4c9ba2c2854e2411fd43e5c5073d8d43c0b"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.971867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5abee4b5f7ef84c964207043a9705149e472f1204711d8bc356a03db80515ae6"} Jan 28 18:35:53 crc kubenswrapper[4749]: I0128 18:35:53.973414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4802ff06fc789df16344807f86e3cfbf37c0ff6793236f807bd99ec15d66a1ec"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.059601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.059639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.059648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.059663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.059674 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.162212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.162259 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.162273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.162294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.162307 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.265511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.265582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.265592 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.265606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.265614 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.368381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.368426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.368434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.368448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.368457 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.471183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.471242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.471253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.471274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.471288 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.561296 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.561386 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.561413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.561433 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561469 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:35:56.561446429 +0000 UTC m=+24.572973194 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.561509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561543 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561557 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561556 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561591 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561610 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561619 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:56.561601993 +0000 UTC m=+24.573128838 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561623 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561639 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561649 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:56.561636014 +0000 UTC m=+24.573162859 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561670 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:56.561662184 +0000 UTC m=+24.573189049 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561568 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.561708 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:56.561701365 +0000 UTC m=+24.573228220 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.573605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.573653 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.573662 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.573677 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.573687 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.676284 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.676350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.676360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.676379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.676391 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.778868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.778925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.778944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.778970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.778987 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.871312 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:35:54 crc kubenswrapper[4749]: E0128 18:35:54.871479 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.875903 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.876445 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.877702 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.878345 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.879251 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.879854 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.880430 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.881479 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.881535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.881592 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.881605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.881627 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.881642 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.882086 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.882997 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.883492 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.884594 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.885101 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.885739 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.886719 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.887205 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.888370 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.888818 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.889495 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.890650 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.891099 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.892041 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 28 18:35:54 
crc kubenswrapper[4749]: I0128 18:35:54.892477 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.893501 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.893886 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.894494 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.895802 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.896283 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.897476 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.898120 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.899182 4749 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.899317 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.901017 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.902011 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.902515 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.904897 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 28 18:35:54 crc 
kubenswrapper[4749]: I0128 18:35:54.905833 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.908259 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.909590 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.911129 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.913600 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.915056 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.916830 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.917642 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.918216 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.918835 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.919365 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.920477 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.920981 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.923285 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.924016 4749 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.925004 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.926590 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.927320 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.955263 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 16:37:02.808885165 +0000 UTC Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.977522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e41ce1a3fbff495b0da98b6c784683249c9540d0362f2b6e9edc7912b5ffd4a3"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.978999 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"84dabf80a1de0267cf9ad5b6b291373d9a7d3447d74a71910f10baf0e8dd6a12"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.979080 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"92cb35864941cefac7411364b73dffb196b63860d7c6d71c40b2e22553135159"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.983840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.983892 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.983905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.983922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.983949 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:54Z","lastTransitionTime":"2026-01-28T18:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:54 crc kubenswrapper[4749]: I0128 18:35:54.995637 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d46ae1a-7005-4413-8fd3-5c7ef768cefd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T18:35:46Z\\\",\\\"message\\\":\\\"W0128 18:35:35.998870 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0128 18:35:35.999282 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769625335 cert, and key in /tmp/serving-cert-2868254237/serving-signer.crt, /tmp/serving-cert-2868254237/serving-signer.key\\\\nI0128 18:35:36.331173 1 observer_polling.go:159] Starting file observer\\\\nW0128 18:35:36.338187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0128 18:35:36.338386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 18:35:36.339841 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2868254237/tls.crt::/tmp/serving-cert-2868254237/tls.key\\\\\\\"\\\\nF0128 18:35:46.558913 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T18:35:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:54Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.008379 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.019385 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.033891 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.045911 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.059418 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.073093 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41ce1a3fbff495b0da98b6c784683249c9540d0362f2b6e9edc7912b5ffd4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.086715 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dabf80a1de0267cf9ad5b6b291373d9a7d3447d74a71910f10baf0e8dd6a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92cb35864941cefac7411364b73dffb196b63860d7c6d71c40b2e22553135159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.086994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.087011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.087019 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.087037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.087049 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.099363 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.117128 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.135662 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41ce1a3fbff495b0da98b6c784683249c9540d0362f2b6e9edc7912b5ffd4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.154244 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.170354 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d46ae1a-7005-4413-8fd3-5c7ef768cefd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T18:35:46Z\\\",\\\"message\\\":\\\"W0128 18:35:35.998870 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0128 
18:35:35.999282 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769625335 cert, and key in /tmp/serving-cert-2868254237/serving-signer.crt, /tmp/serving-cert-2868254237/serving-signer.key\\\\nI0128 18:35:36.331173 1 observer_polling.go:159] Starting file observer\\\\nW0128 18:35:36.338187 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0128 18:35:36.338386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 18:35:36.339841 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2868254237/tls.crt::/tmp/serving-cert-2868254237/tls.key\\\\\\\"\\\\nF0128 18:35:46.558913 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T18:35:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T18:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T18:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T18:35:32Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.186440 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T18:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T18:35:55Z is after 2025-08-24T17:21:41Z" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.188988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.189037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.189047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.189061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.189070 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.292062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.292122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.292133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.292156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.292168 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.394355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.394410 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.394423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.394443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.394455 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.496557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.496623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.496633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.496655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.496668 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.598588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.598632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.598643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.598658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.598670 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.700647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.700706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.700721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.700738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.700751 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.803021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.803068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.803079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.803097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.803106 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.870639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.870666 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:55 crc kubenswrapper[4749]: E0128 18:35:55.870773 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:35:55 crc kubenswrapper[4749]: E0128 18:35:55.870911 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.905912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.905952 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.905964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.905979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.905988 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:55Z","lastTransitionTime":"2026-01-28T18:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:55 crc kubenswrapper[4749]: I0128 18:35:55.956106 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 19:41:54.64095101 +0000 UTC Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.008231 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.008292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.008309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.008346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.008368 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.110712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.110756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.110767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.110783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.110796 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.215577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.215651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.215665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.215684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.215694 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.318749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.318812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.318831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.318854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.318870 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.421534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.421588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.421600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.421621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.421635 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.524048 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.524109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.524121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.524143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.524159 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.577817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.577897 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.577921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.577942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.577972 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:00.577947325 +0000 UTC m=+28.589474100 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.578011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578056 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578078 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578092 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578136 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578159 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578178 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578189 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578147 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:00.578128139 +0000 UTC m=+28.589654994 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578227 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:00.578216921 +0000 UTC m=+28.589743696 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578237 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:00.578233032 +0000 UTC m=+28.589759807 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578289 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.578506 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:00.578477708 +0000 UTC m=+28.590004523 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.626920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.626958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.626967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.626982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.626991 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.729091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.729119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.729126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.729139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.729148 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.768416 4749 csr.go:261] certificate signing request csr-jq9pc is approved, waiting to be issued Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.800179 4749 csr.go:257] certificate signing request csr-jq9pc is issued Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.831684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.831727 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.831739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.831757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.831771 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.871307 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.871474 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.935510 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.935551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.935559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.935575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.935585 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:56Z","lastTransitionTime":"2026-01-28T18:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.956649 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 15:42:03.202054711 +0000 UTC Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.961139 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bb9jw"] Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.961435 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bb9jw" Jan 28 18:35:56 crc kubenswrapper[4749]: W0128 18:35:56.962632 4749 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Jan 28 18:35:56 crc kubenswrapper[4749]: W0128 18:35:56.962651 4749 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.962701 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.962666 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:56 crc kubenswrapper[4749]: W0128 18:35:56.962878 4749 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Jan 28 18:35:56 crc kubenswrapper[4749]: E0128 18:35:56.962905 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.980390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6418fcdb-072f-43dd-b11c-d8835fb6f5f6-hosts-file\") pod \"node-resolver-bb9jw\" (UID: \"6418fcdb-072f-43dd-b11c-d8835fb6f5f6\") " pod="openshift-dns/node-resolver-bb9jw" Jan 28 
18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.980444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnglp\" (UniqueName: \"kubernetes.io/projected/6418fcdb-072f-43dd-b11c-d8835fb6f5f6-kube-api-access-qnglp\") pod \"node-resolver-bb9jw\" (UID: \"6418fcdb-072f-43dd-b11c-d8835fb6f5f6\") " pod="openshift-dns/node-resolver-bb9jw" Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.984597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1160f7e1034afc8dba7662b981b07c92a0a5f12619ea97eaa728c7f4e78ffdc1"} Jan 28 18:35:56 crc kubenswrapper[4749]: I0128 18:35:56.992448 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=4.992433153 podStartE2EDuration="4.992433153s" podCreationTimestamp="2026-01-28 18:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:35:56.977176522 +0000 UTC m=+24.988703307" watchObservedRunningTime="2026-01-28 18:35:56.992433153 +0000 UTC m=+25.003959928" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.000944 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-z47j6"] Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.001501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.003182 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.003666 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.004270 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.004635 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.037361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.037406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.037419 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.037433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.037444 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.054577 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-z8jvg"] Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.054929 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: W0128 18:35:57.057357 4749 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.057406 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:57 crc kubenswrapper[4749]: W0128 18:35:57.057464 4749 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.057479 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:57 crc kubenswrapper[4749]: W0128 18:35:57.057516 4749 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.057529 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.058415 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.058733 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnglp\" (UniqueName: \"kubernetes.io/projected/6418fcdb-072f-43dd-b11c-d8835fb6f5f6-kube-api-access-qnglp\") pod 
\"node-resolver-bb9jw\" (UID: \"6418fcdb-072f-43dd-b11c-d8835fb6f5f6\") " pod="openshift-dns/node-resolver-bb9jw" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-system-cni-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-k8s-cni-cncf-io\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-etc-kubernetes\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zkzd\" (UniqueName: \"kubernetes.io/projected/77c0ba3f-b876-47af-8f7e-b66e632b74cb-kube-api-access-6zkzd\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-daemon-config\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-cnibin\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081212 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-hostroot\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6418fcdb-072f-43dd-b11c-d8835fb6f5f6-hosts-file\") pod \"node-resolver-bb9jw\" (UID: \"6418fcdb-072f-43dd-b11c-d8835fb6f5f6\") " pod="openshift-dns/node-resolver-bb9jw" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77c0ba3f-b876-47af-8f7e-b66e632b74cb-serviceca\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " 
pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081298 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-os-release\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-cni-binary-copy\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-cni-multus\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77c0ba3f-b876-47af-8f7e-b66e632b74cb-host\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-netns\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6418fcdb-072f-43dd-b11c-d8835fb6f5f6-hosts-file\") pod \"node-resolver-bb9jw\" (UID: \"6418fcdb-072f-43dd-b11c-d8835fb6f5f6\") " pod="openshift-dns/node-resolver-bb9jw" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-cni-bin\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-multus-certs\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-cni-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 
18:35:57.081573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-socket-dir-parent\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d62x9\" (UniqueName: \"kubernetes.io/projected/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-kube-api-access-d62x9\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-kubelet\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.081675 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-conf-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.089855 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-698zt"] Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.090263 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.093405 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.093807 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.095319 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.095718 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.099002 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.122118 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-bgqhl"] Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.122854 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.125264 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.125768 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.140113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.140153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.140163 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.140179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.140191 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-multus-certs\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-cni-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182081 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-socket-dir-parent\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1841c82d-7cd1-4c14-b54d-794bbb647776-mcd-auth-proxy-config\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182141 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-conf-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " 
pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-k8s-cni-cncf-io\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-multus-certs\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-k8s-cni-cncf-io\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-socket-dir-parent\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-system-cni-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-conf-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182372 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-cni-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-cnibin\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-system-cni-dir\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-hostroot\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182527 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-hostroot\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlrld\" (UniqueName: \"kubernetes.io/projected/201c8cd8-38ab-4afd-9542-2f745acb02e6-kube-api-access-vlrld\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182584 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-cnibin\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-os-release\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-cni-multus\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182649 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-os-release\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77c0ba3f-b876-47af-8f7e-b66e632b74cb-host\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-cni-multus\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" 
Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-netns\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-run-netns\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-cni-bin\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-cnibin\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d62x9\" (UniqueName: \"kubernetes.io/projected/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-kube-api-access-d62x9\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182819 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77c0ba3f-b876-47af-8f7e-b66e632b74cb-host\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183032 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-system-cni-dir\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.182911 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-cni-bin\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183067 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-kubelet\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1841c82d-7cd1-4c14-b54d-794bbb647776-rootfs\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183086 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-host-var-lib-kubelet\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183172 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-os-release\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-etc-kubernetes\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183268 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zkzd\" (UniqueName: \"kubernetes.io/projected/77c0ba3f-b876-47af-8f7e-b66e632b74cb-kube-api-access-6zkzd\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-daemon-config\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183312 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-etc-kubernetes\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnxp9\" (UniqueName: \"kubernetes.io/projected/1841c82d-7cd1-4c14-b54d-794bbb647776-kube-api-access-gnxp9\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/201c8cd8-38ab-4afd-9542-2f745acb02e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183431 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1841c82d-7cd1-4c14-b54d-794bbb647776-proxy-tls\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77c0ba3f-b876-47af-8f7e-b66e632b74cb-serviceca\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-cni-binary-copy\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.183578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/201c8cd8-38ab-4afd-9542-2f745acb02e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.184040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-multus-daemon-config\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.185723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77c0ba3f-b876-47af-8f7e-b66e632b74cb-serviceca\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.208003 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wzvwl"] Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.208741 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: W0128 18:35:57.210953 4749 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 28 18:35:57 crc kubenswrapper[4749]: W0128 18:35:57.210969 4749 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.210995 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.211021 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:57 crc kubenswrapper[4749]: W0128 18:35:57.210956 4749 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.211053 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.212155 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.212197 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.212617 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.213163 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.217883 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6zkzd\" (UniqueName: \"kubernetes.io/projected/77c0ba3f-b876-47af-8f7e-b66e632b74cb-kube-api-access-6zkzd\") pod \"node-ca-z47j6\" (UID: \"77c0ba3f-b876-47af-8f7e-b66e632b74cb\") " pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.223686 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.230080 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.242853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.242908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.242918 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.242955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.242966 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.253652 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/201c8cd8-38ab-4afd-9542-2f745acb02e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-ovn-kubernetes\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-config\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284173 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-netns\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-ovn\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1841c82d-7cd1-4c14-b54d-794bbb647776-mcd-auth-proxy-config\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-systemd\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 
crc kubenswrapper[4749]: I0128 18:35:57.284340 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-etc-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284365 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284402 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-slash\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlrld\" (UniqueName: \"kubernetes.io/projected/201c8cd8-38ab-4afd-9542-2f745acb02e6-kube-api-access-vlrld\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284434 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-node-log\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-system-cni-dir\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-cnibin\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284489 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1841c82d-7cd1-4c14-b54d-794bbb647776-rootfs\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-kubelet\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-systemd-units\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284556 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-env-overrides\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fjp9\" (UniqueName: \"kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284596 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-netd\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/290d31fb-b204-4b3c-84ea-a5d597748b18-ovn-node-metrics-cert\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-os-release\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnxp9\" (UniqueName: \"kubernetes.io/projected/1841c82d-7cd1-4c14-b54d-794bbb647776-kube-api-access-gnxp9\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284672 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-var-lib-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1841c82d-7cd1-4c14-b54d-794bbb647776-proxy-tls\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-log-socket\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-bin\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/201c8cd8-38ab-4afd-9542-2f745acb02e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284753 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-script-lib\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1841c82d-7cd1-4c14-b54d-794bbb647776-rootfs\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284944 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1841c82d-7cd1-4c14-b54d-794bbb647776-mcd-auth-proxy-config\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.284982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/201c8cd8-38ab-4afd-9542-2f745acb02e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.285021 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-system-cni-dir\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.285027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-os-release\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.285174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-cnibin\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.289973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1841c82d-7cd1-4c14-b54d-794bbb647776-proxy-tls\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.297289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/201c8cd8-38ab-4afd-9542-2f745acb02e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.309839 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnxp9\" (UniqueName: \"kubernetes.io/projected/1841c82d-7cd1-4c14-b54d-794bbb647776-kube-api-access-gnxp9\") pod \"machine-config-daemon-698zt\" (UID: \"1841c82d-7cd1-4c14-b54d-794bbb647776\") " pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.313694 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-z47j6" Jan 28 18:35:57 crc kubenswrapper[4749]: W0128 18:35:57.324020 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77c0ba3f_b876_47af_8f7e_b66e632b74cb.slice/crio-016f8e16e2acd077f6cbd055fec0b4d7dab64dac751445bbd751caaa01bcb05e WatchSource:0}: Error finding container 016f8e16e2acd077f6cbd055fec0b4d7dab64dac751445bbd751caaa01bcb05e: Status 404 returned error can't find the container with id 016f8e16e2acd077f6cbd055fec0b4d7dab64dac751445bbd751caaa01bcb05e Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.345543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.345585 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.345597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.345612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.345624 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.368937 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-thsj6"] Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.369419 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.369488 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-thsj6" podUID="513ef52a-4532-409e-b188-5101ab5a3fff" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386177 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-slash\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-node-log\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-kubelet\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-systemd-units\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-env-overrides\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-node-log\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386348 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-systemd-units\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-slash\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 
18:35:57.386313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-kubelet\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fjp9\" (UniqueName: \"kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/290d31fb-b204-4b3c-84ea-a5d597748b18-ovn-node-metrics-cert\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-netd\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-var-lib-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vbg5\" (UniqueName: \"kubernetes.io/projected/513ef52a-4532-409e-b188-5101ab5a3fff-kube-api-access-2vbg5\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-netd\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386550 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-log-socket\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-bin\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386595 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-var-lib-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-log-socket\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-script-lib\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-bin\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-ovn-kubernetes\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-config\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386724 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-netns\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-ovn\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-ovn-kubernetes\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-netns\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-systemd\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386825 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-ovn\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-etc-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-systemd\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-etc-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.386943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-openvswitch\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.387366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-script-lib\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.387414 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-config\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.391918 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/290d31fb-b204-4b3c-84ea-a5d597748b18-ovn-node-metrics-cert\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.402395 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.407520 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.407502995 podStartE2EDuration="407.502995ms" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:35:57.407495375 +0000 UTC m=+25.419022160" watchObservedRunningTime="2026-01-28 18:35:57.407502995 +0000 UTC m=+25.419029770" Jan 28 18:35:57 crc kubenswrapper[4749]: W0128 18:35:57.414121 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1841c82d_7cd1_4c14_b54d_794bbb647776.slice/crio-e0a591aa3bde2929c585ef3a8b7cd9c8eeedf2d56ef432bf9767f865454ea035 WatchSource:0}: Error finding container e0a591aa3bde2929c585ef3a8b7cd9c8eeedf2d56ef432bf9767f865454ea035: Status 404 returned error can't find the container with id e0a591aa3bde2929c585ef3a8b7cd9c8eeedf2d56ef432bf9767f865454ea035 Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.448949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.448987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.448997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.449011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.449020 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.487721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.487815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbg5\" (UniqueName: \"kubernetes.io/projected/513ef52a-4532-409e-b188-5101ab5a3fff-kube-api-access-2vbg5\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.487893 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.487962 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs podName:513ef52a-4532-409e-b188-5101ab5a3fff nodeName:}" failed. No retries permitted until 2026-01-28 18:35:57.987943854 +0000 UTC m=+25.999470629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs") pod "network-metrics-daemon-thsj6" (UID: "513ef52a-4532-409e-b188-5101ab5a3fff") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.551031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.551074 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.551087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.551108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.551124 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.557375 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b"] Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.558078 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.560875 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.562060 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.588274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxrtk\" (UniqueName: \"kubernetes.io/projected/da399eec-26dd-47ee-8c37-559aae03dcf3-kube-api-access-rxrtk\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.588362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da399eec-26dd-47ee-8c37-559aae03dcf3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.588382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da399eec-26dd-47ee-8c37-559aae03dcf3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.588429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da399eec-26dd-47ee-8c37-559aae03dcf3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.652772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.652801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.652811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.652824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.652833 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.689268 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da399eec-26dd-47ee-8c37-559aae03dcf3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.689359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da399eec-26dd-47ee-8c37-559aae03dcf3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.689433 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da399eec-26dd-47ee-8c37-559aae03dcf3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.689502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxrtk\" (UniqueName: \"kubernetes.io/projected/da399eec-26dd-47ee-8c37-559aae03dcf3-kube-api-access-rxrtk\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.690188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da399eec-26dd-47ee-8c37-559aae03dcf3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.693083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da399eec-26dd-47ee-8c37-559aae03dcf3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.755432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.755472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.755481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.755496 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.755506 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.801155 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-28 18:30:56 +0000 UTC, rotation deadline is 2026-11-18 02:48:58.570201301 +0000 UTC Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.801242 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7040h13m0.768963086s for next certificate rotation Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.859206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.859246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.859255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.859271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.859283 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.870639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.870756 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.870824 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.870967 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.906205 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.955766 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.957020 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 17:45:04.171013034 +0000 UTC Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.957143 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.962082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.962125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.962133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.962157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.962168 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:57Z","lastTransitionTime":"2026-01-28T18:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.965904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/201c8cd8-38ab-4afd-9542-2f745acb02e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.965932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-cni-binary-copy\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.989929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"98e5bb244bc2927e83fff562a68e7cf378c6be756f33b6d22c16c73bc074e76f"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.990020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"4776e4bbf405860b0865ff2250d0eef5141f9c0b4c47049e6cb2d3cde9522949"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.990030 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"e0a591aa3bde2929c585ef3a8b7cd9c8eeedf2d56ef432bf9767f865454ea035"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.992374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z47j6" event={"ID":"77c0ba3f-b876-47af-8f7e-b66e632b74cb","Type":"ContainerStarted","Data":"2405ad7feabda732859d16972ac842822c656ea74aeb8578f1aef689d439a350"} Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.992498 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.992415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z47j6" event={"ID":"77c0ba3f-b876-47af-8f7e-b66e632b74cb","Type":"ContainerStarted","Data":"016f8e16e2acd077f6cbd055fec0b4d7dab64dac751445bbd751caaa01bcb05e"} Jan 28 18:35:57 crc kubenswrapper[4749]: I0128 18:35:57.992374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:57 crc kubenswrapper[4749]: E0128 18:35:57.992585 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs podName:513ef52a-4532-409e-b188-5101ab5a3fff nodeName:}" failed. No retries permitted until 2026-01-28 18:35:58.992553282 +0000 UTC m=+27.004080057 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs") pod "network-metrics-daemon-thsj6" (UID: "513ef52a-4532-409e-b188-5101ab5a3fff") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.007000 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podStartSLOduration=1.006978782 podStartE2EDuration="1.006978782s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:35:58.006673665 +0000 UTC m=+26.018200440" watchObservedRunningTime="2026-01-28 18:35:58.006978782 +0000 UTC m=+26.018505557" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.063973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.064021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.064033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.064084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.064099 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.126919 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.130808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da399eec-26dd-47ee-8c37-559aae03dcf3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.137681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-env-overrides\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.166404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.166453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.166467 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.166486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.166497 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.212964 4749 projected.go:288] Couldn't get configMap openshift-multus/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.213005 4749 projected.go:194] Error preparing data for projected volume kube-api-access-d62x9 for pod openshift-multus/multus-z8jvg: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.213075 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-kube-api-access-d62x9 podName:bb6b5d81-5370-425d-a3f6-ebfc447a3d27 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:58.713055978 +0000 UTC m=+26.724582753 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-d62x9" (UniqueName: "kubernetes.io/projected/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-kube-api-access-d62x9") pod "multus-z8jvg" (UID: "bb6b5d81-5370-425d-a3f6-ebfc447a3d27") : failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.269594 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.270279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.270310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.270343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.270362 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.270376 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.281183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnglp\" (UniqueName: \"kubernetes.io/projected/6418fcdb-072f-43dd-b11c-d8835fb6f5f6-kube-api-access-qnglp\") pod \"node-resolver-bb9jw\" (UID: \"6418fcdb-072f-43dd-b11c-d8835fb6f5f6\") " pod="openshift-dns/node-resolver-bb9jw" Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.298854 4749 projected.go:288] Couldn't get configMap openshift-multus/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.298912 4749 projected.go:194] Error preparing data for projected volume kube-api-access-vlrld for pod openshift-multus/multus-additional-cni-plugins-bgqhl: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.299020 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/201c8cd8-38ab-4afd-9542-2f745acb02e6-kube-api-access-vlrld podName:201c8cd8-38ab-4afd-9542-2f745acb02e6 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:58.798991383 +0000 UTC m=+26.810518158 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vlrld" (UniqueName: "kubernetes.io/projected/201c8cd8-38ab-4afd-9542-2f745acb02e6-kube-api-access-vlrld") pod "multus-additional-cni-plugins-bgqhl" (UID: "201c8cd8-38ab-4afd-9542-2f745acb02e6") : failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.302542 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.315155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vbg5\" (UniqueName: \"kubernetes.io/projected/513ef52a-4532-409e-b188-5101ab5a3fff-kube-api-access-2vbg5\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.372397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.372438 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.372454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.372470 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.372481 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.414254 4749 projected.go:288] Couldn't get configMap openshift-ovn-kubernetes/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.414310 4749 projected.go:194] Error preparing data for projected volume kube-api-access-8fjp9 for pod openshift-ovn-kubernetes/ovnkube-node-wzvwl: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.414404 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9 podName:290d31fb-b204-4b3c-84ea-a5d597748b18 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:58.914382514 +0000 UTC m=+26.925909289 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8fjp9" (UniqueName: "kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9") pod "ovnkube-node-wzvwl" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18") : failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.444751 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.472681 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bb9jw" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.475113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.475253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.475348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.475451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.475558 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.488880 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.579045 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.579371 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.579383 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.579397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.579433 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.681308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.681361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.681370 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.681384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.681394 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.704038 4749 projected.go:288] Couldn't get configMap openshift-ovn-kubernetes/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.704112 4749 projected.go:194] Error preparing data for projected volume kube-api-access-rxrtk for pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.704187 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da399eec-26dd-47ee-8c37-559aae03dcf3-kube-api-access-rxrtk podName:da399eec-26dd-47ee-8c37-559aae03dcf3 nodeName:}" failed. No retries permitted until 2026-01-28 18:35:59.204157218 +0000 UTC m=+27.215683993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rxrtk" (UniqueName: "kubernetes.io/projected/da399eec-26dd-47ee-8c37-559aae03dcf3-kube-api-access-rxrtk") pod "ovnkube-control-plane-749d76644c-nkh7b" (UID: "da399eec-26dd-47ee-8c37-559aae03dcf3") : failed to sync configmap cache: timed out waiting for the condition Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.773109 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.783665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.783705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.783715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.783730 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.783739 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.800458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlrld\" (UniqueName: \"kubernetes.io/projected/201c8cd8-38ab-4afd-9542-2f745acb02e6-kube-api-access-vlrld\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.800506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d62x9\" (UniqueName: \"kubernetes.io/projected/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-kube-api-access-d62x9\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.804632 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlrld\" (UniqueName: \"kubernetes.io/projected/201c8cd8-38ab-4afd-9542-2f745acb02e6-kube-api-access-vlrld\") pod \"multus-additional-cni-plugins-bgqhl\" (UID: \"201c8cd8-38ab-4afd-9542-2f745acb02e6\") " pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.805520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d62x9\" (UniqueName: \"kubernetes.io/projected/bb6b5d81-5370-425d-a3f6-ebfc447a3d27-kube-api-access-d62x9\") pod \"multus-z8jvg\" (UID: \"bb6b5d81-5370-425d-a3f6-ebfc447a3d27\") " pod="openshift-multus/multus-z8jvg" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.869619 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z8jvg" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.870382 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.870382 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.870519 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thsj6" podUID="513ef52a-4532-409e-b188-5101ab5a3fff" Jan 28 18:35:58 crc kubenswrapper[4749]: E0128 18:35:58.870609 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:35:58 crc kubenswrapper[4749]: W0128 18:35:58.880088 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb6b5d81_5370_425d_a3f6_ebfc447a3d27.slice/crio-25e02ce31b8391d46bc6397ed7281628e56b92c4b64eeb921b4ce112ce4f616f WatchSource:0}: Error finding container 25e02ce31b8391d46bc6397ed7281628e56b92c4b64eeb921b4ce112ce4f616f: Status 404 returned error can't find the container with id 25e02ce31b8391d46bc6397ed7281628e56b92c4b64eeb921b4ce112ce4f616f Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.886068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.886143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.886158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.886172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.886182 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.933822 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" Jan 28 18:35:58 crc kubenswrapper[4749]: W0128 18:35:58.950903 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod201c8cd8_38ab_4afd_9542_2f745acb02e6.slice/crio-5336612a8f6783c18e00746c5fc755a4a1ff57bd714bdbf0d12483a14d3f066f WatchSource:0}: Error finding container 5336612a8f6783c18e00746c5fc755a4a1ff57bd714bdbf0d12483a14d3f066f: Status 404 returned error can't find the container with id 5336612a8f6783c18e00746c5fc755a4a1ff57bd714bdbf0d12483a14d3f066f Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.957588 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 05:26:52.132716865 +0000 UTC Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.989755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.989797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.989806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.989821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.989831 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:58Z","lastTransitionTime":"2026-01-28T18:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.996146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerStarted","Data":"5336612a8f6783c18e00746c5fc755a4a1ff57bd714bdbf0d12483a14d3f066f"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.997500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z8jvg" event={"ID":"bb6b5d81-5370-425d-a3f6-ebfc447a3d27","Type":"ContainerStarted","Data":"25e02ce31b8391d46bc6397ed7281628e56b92c4b64eeb921b4ce112ce4f616f"} Jan 28 18:35:58 crc kubenswrapper[4749]: I0128 18:35:58.998994 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bb9jw" event={"ID":"6418fcdb-072f-43dd-b11c-d8835fb6f5f6","Type":"ContainerStarted","Data":"2fd792c722df723698a436a3834ddc84c497d8963b0143d97aee457c5191bf4d"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.001832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.001937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fjp9\" (UniqueName: \"kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:59 crc kubenswrapper[4749]: E0128 18:35:59.002042 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:35:59 crc kubenswrapper[4749]: E0128 18:35:59.002099 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs podName:513ef52a-4532-409e-b188-5101ab5a3fff nodeName:}" failed. No retries permitted until 2026-01-28 18:36:01.002081947 +0000 UTC m=+29.013608722 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs") pod "network-metrics-daemon-thsj6" (UID: "513ef52a-4532-409e-b188-5101ab5a3fff") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.007492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fjp9\" (UniqueName: \"kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9\") pod \"ovnkube-node-wzvwl\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.030135 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.093242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.093678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.093692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.093708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.093725 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.195971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.196012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.196025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.196043 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.196055 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.298741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.298781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.298791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.298804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.298814 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.305577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxrtk\" (UniqueName: \"kubernetes.io/projected/da399eec-26dd-47ee-8c37-559aae03dcf3-kube-api-access-rxrtk\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.311641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxrtk\" (UniqueName: \"kubernetes.io/projected/da399eec-26dd-47ee-8c37-559aae03dcf3-kube-api-access-rxrtk\") pod \"ovnkube-control-plane-749d76644c-nkh7b\" (UID: \"da399eec-26dd-47ee-8c37-559aae03dcf3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.376078 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.401762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.401802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.401811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.401826 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.401837 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: W0128 18:35:59.408445 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda399eec_26dd_47ee_8c37_559aae03dcf3.slice/crio-3c3502ed5ada29592861abaac8eee803c5ac68782d0ad98c822acce015e2c698 WatchSource:0}: Error finding container 3c3502ed5ada29592861abaac8eee803c5ac68782d0ad98c822acce015e2c698: Status 404 returned error can't find the container with id 3c3502ed5ada29592861abaac8eee803c5ac68782d0ad98c822acce015e2c698 Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.505398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.505426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.505435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.505448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.505459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.608645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.608963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.608975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.608993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.609005 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.711388 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.711424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.711431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.711444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.711453 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.813918 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.813960 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.813970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.813986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.813995 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.871235 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.871292 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:35:59 crc kubenswrapper[4749]: E0128 18:35:59.871381 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:35:59 crc kubenswrapper[4749]: E0128 18:35:59.871427 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.916189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.916236 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.916247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.916262 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.916275 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:35:59Z","lastTransitionTime":"2026-01-28T18:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:35:59 crc kubenswrapper[4749]: I0128 18:35:59.958458 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 16:40:31.784990238 +0000 UTC Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.003435 4749 generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906" exitCode=0 Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.003495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.003522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"fdc77ac81bee03866949dd3e860090a1775d47ebce943b27146d074b8d25bdfc"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.006141 4749 generic.go:334] "Generic (PLEG): container finished" podID="201c8cd8-38ab-4afd-9542-2f745acb02e6" containerID="931203fa6b760a5ac213ddc4e47719958197b71ad1925ff587f454c9ad4f842a" exitCode=0 Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.006222 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerDied","Data":"931203fa6b760a5ac213ddc4e47719958197b71ad1925ff587f454c9ad4f842a"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 
18:36:00.008295 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" event={"ID":"da399eec-26dd-47ee-8c37-559aae03dcf3","Type":"ContainerStarted","Data":"52b94b10956eebdd7bfc0f0b55b44e36c0dcaf978e0c6c75ca23639323e86221"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.008354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" event={"ID":"da399eec-26dd-47ee-8c37-559aae03dcf3","Type":"ContainerStarted","Data":"ead1db525d0c1844e5afc5f34b32fc2b083820b06bc161a2e0c5daa15af7c971"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.008368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" event={"ID":"da399eec-26dd-47ee-8c37-559aae03dcf3","Type":"ContainerStarted","Data":"3c3502ed5ada29592861abaac8eee803c5ac68782d0ad98c822acce015e2c698"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.009784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bb9jw" event={"ID":"6418fcdb-072f-43dd-b11c-d8835fb6f5f6","Type":"ContainerStarted","Data":"9484ad437be4c6600ec081fcba1a0bcc82c10c585d1805f092291963e399a67f"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.011162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z8jvg" event={"ID":"bb6b5d81-5370-425d-a3f6-ebfc447a3d27","Type":"ContainerStarted","Data":"daf25062d286e005e2e33e89ad24c0e39c08bb54adb935c08e1d664dc39d1a98"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.018633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.018668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.018680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.018696 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.018710 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.028980 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z47j6" podStartSLOduration=4.028964625 podStartE2EDuration="4.028964625s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:35:58.018079359 +0000 UTC m=+26.029606144" watchObservedRunningTime="2026-01-28 18:36:00.028964625 +0000 UTC m=+28.040491400" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.043123 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nkh7b" podStartSLOduration=3.043106588 podStartE2EDuration="3.043106588s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:00.042425901 +0000 UTC m=+28.053952696" watchObservedRunningTime="2026-01-28 18:36:00.043106588 +0000 UTC m=+28.054633363" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.081046 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bb9jw" podStartSLOduration=4.0810178950000005 podStartE2EDuration="4.081017895s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:00.079743603 +0000 UTC m=+28.091270398" watchObservedRunningTime="2026-01-28 18:36:00.081017895 +0000 UTC m=+28.092544670" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.098697 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z8jvg" podStartSLOduration=3.098677395 podStartE2EDuration="3.098677395s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:00.098577462 +0000 UTC m=+28.110104257" watchObservedRunningTime="2026-01-28 18:36:00.098677395 +0000 UTC m=+28.110204170" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.122205 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.122255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.122267 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.122291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.122308 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.225440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.225484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.225495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.225510 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.225521 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.329049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.329100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.329113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.329128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.329138 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.435121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.435166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.435179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.435194 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.435205 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.537254 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.537296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.537307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.537321 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.537354 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.618197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618389 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:08.6183577 +0000 UTC m=+36.629884475 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.618482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.618519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.618577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.618605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618650 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618715 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:08.618695419 +0000 UTC m=+36.630222264 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618725 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618766 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:08.61875573 +0000 UTC m=+36.630282515 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618837 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618859 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618895 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618913 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618998 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:08.618962775 +0000 UTC m=+36.630489580 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.618871 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.619028 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.619057 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:08.619048807 +0000 UTC m=+36.630575602 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.639812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.639850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.639859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.639874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.639890 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.742925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.742973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.742981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.743001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.743010 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.845558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.845594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.845604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.845618 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.845628 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.870830 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.870845 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.870971 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-thsj6" podUID="513ef52a-4532-409e-b188-5101ab5a3fff" Jan 28 18:36:00 crc kubenswrapper[4749]: E0128 18:36:00.871079 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.947459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.947490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.947498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.947512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.947521 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:00Z","lastTransitionTime":"2026-01-28T18:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:00 crc kubenswrapper[4749]: I0128 18:36:00.958882 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 05:11:36.941437746 +0000 UTC Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.015275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.015778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.017381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerStarted","Data":"b0b43964fa9dd455525bb689b09b418306377c4898d752fd0fe91937e23196ed"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.021473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:01 crc kubenswrapper[4749]: E0128 18:36:01.021567 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Jan 28 18:36:01 crc kubenswrapper[4749]: E0128 18:36:01.021610 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs podName:513ef52a-4532-409e-b188-5101ab5a3fff nodeName:}" failed. No retries permitted until 2026-01-28 18:36:05.021595718 +0000 UTC m=+33.033122493 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs") pod "network-metrics-daemon-thsj6" (UID: "513ef52a-4532-409e-b188-5101ab5a3fff") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.050141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.050175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.050184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.050201 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.050210 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.153286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.153341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.153384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.153420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.153432 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.255863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.255914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.255927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.255944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.255987 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.361509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.361548 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.361558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.361573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.361584 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.463872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.463919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.463929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.463944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.463955 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.566632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.566677 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.566688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.566706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.566717 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.669241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.669278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.669290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.669305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.669317 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.771673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.771718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.771729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.771745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.771757 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.871861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:01 crc kubenswrapper[4749]: E0128 18:36:01.872020 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.872053 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:01 crc kubenswrapper[4749]: E0128 18:36:01.872210 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.874441 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.874471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.874479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.874493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.874505 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.960124 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:44:06.607523461 +0000 UTC Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.976737 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.976769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.976780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.976796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:01 crc kubenswrapper[4749]: I0128 18:36:01.976807 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:01Z","lastTransitionTime":"2026-01-28T18:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.023260 4749 generic.go:334] "Generic (PLEG): container finished" podID="201c8cd8-38ab-4afd-9542-2f745acb02e6" containerID="b0b43964fa9dd455525bb689b09b418306377c4898d752fd0fe91937e23196ed" exitCode=0 Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.023309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerDied","Data":"b0b43964fa9dd455525bb689b09b418306377c4898d752fd0fe91937e23196ed"} Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.030266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.030301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.030311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.030337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.030348 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T18:36:02Z","lastTransitionTime":"2026-01-28T18:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.033276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4"} Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.033314 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1"} Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.033350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e"} Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.033363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7"} Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.079792 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244"] Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.080531 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.082025 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.082052 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.082636 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.083508 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.130928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e9a8cb7d-b3cd-4282-abec-05824405d7de-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.130978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a8cb7d-b3cd-4282-abec-05824405d7de-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.131006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" 
(UniqueName: \"kubernetes.io/host-path/e9a8cb7d-b3cd-4282-abec-05824405d7de-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.131024 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9a8cb7d-b3cd-4282-abec-05824405d7de-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.131279 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e9a8cb7d-b3cd-4282-abec-05824405d7de-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.232320 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e9a8cb7d-b3cd-4282-abec-05824405d7de-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.232393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a8cb7d-b3cd-4282-abec-05824405d7de-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.232413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e9a8cb7d-b3cd-4282-abec-05824405d7de-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.232432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9a8cb7d-b3cd-4282-abec-05824405d7de-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.232485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e9a8cb7d-b3cd-4282-abec-05824405d7de-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.232521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/e9a8cb7d-b3cd-4282-abec-05824405d7de-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.232517 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e9a8cb7d-b3cd-4282-abec-05824405d7de-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.233403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9a8cb7d-b3cd-4282-abec-05824405d7de-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.237581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a8cb7d-b3cd-4282-abec-05824405d7de-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.252153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e9a8cb7d-b3cd-4282-abec-05824405d7de-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd244\" (UID: \"e9a8cb7d-b3cd-4282-abec-05824405d7de\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.392359 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.686810 4749 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 28 18:36:02 crc kubenswrapper[4749]: W0128 18:36:02.687383 4749 reflector.go:484] object-"openshift-cluster-version"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-cluster-version"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 28 18:36:02 crc kubenswrapper[4749]: W0128 18:36:02.687736 4749 reflector.go:484] object-"openshift-cluster-version"/"cluster-version-operator-serving-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-cluster-version"/"cluster-version-operator-serving-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 28 18:36:02 crc kubenswrapper[4749]: W0128 18:36:02.688014 4749 reflector.go:484] object-"openshift-cluster-version"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-cluster-version"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 28 18:36:02 crc kubenswrapper[4749]: W0128 18:36:02.688039 4749 reflector.go:484] object-"openshift-cluster-version"/"default-dockercfg-gxtc4": watch of *v1.Secret ended with: very short watch: object-"openshift-cluster-version"/"default-dockercfg-gxtc4": Unexpected watch close - watch lasted less than a second and no items received Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.870599 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.870632 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:02 crc kubenswrapper[4749]: E0128 18:36:02.871701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:36:02 crc kubenswrapper[4749]: E0128 18:36:02.871915 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-thsj6" podUID="513ef52a-4532-409e-b188-5101ab5a3fff" Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.960952 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 17:48:53.947935662 +0000 UTC Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.961057 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 28 18:36:02 crc kubenswrapper[4749]: I0128 18:36:02.968390 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.038441 4749 generic.go:334] "Generic (PLEG): container finished" podID="201c8cd8-38ab-4afd-9542-2f745acb02e6" containerID="f1f34cf9787e829570ea3a4eea5ddbec0dd9172e162cc496c52d66d0cfa818cf" exitCode=0 Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.038556 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerDied","Data":"f1f34cf9787e829570ea3a4eea5ddbec0dd9172e162cc496c52d66d0cfa818cf"} Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.041951 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" event={"ID":"e9a8cb7d-b3cd-4282-abec-05824405d7de","Type":"ContainerStarted","Data":"497a8e86f73b6aab77f112fdaf2b40daff9ee9e4ee6525670fbb6a4fe11338d8"} Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.042004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" event={"ID":"e9a8cb7d-b3cd-4282-abec-05824405d7de","Type":"ContainerStarted","Data":"adc93f0e28802c2752801fea32513d72015d7a0c2cca0e79e38cda4c46957d0d"} Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.079298 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd244" podStartSLOduration=7.079279421 podStartE2EDuration="7.079279421s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:03.078298466 +0000 UTC m=+31.089825261" watchObservedRunningTime="2026-01-28 18:36:03.079279421 +0000 UTC m=+31.090806186" Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.503174 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.587550 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.761103 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.870445 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:03 crc kubenswrapper[4749]: I0128 18:36:03.870462 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:03 crc kubenswrapper[4749]: E0128 18:36:03.870571 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:36:03 crc kubenswrapper[4749]: E0128 18:36:03.870708 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:36:04 crc kubenswrapper[4749]: I0128 18:36:04.019435 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 28 18:36:04 crc kubenswrapper[4749]: I0128 18:36:04.047718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397"} Jan 28 18:36:04 crc kubenswrapper[4749]: I0128 18:36:04.049829 4749 generic.go:334] "Generic (PLEG): container finished" podID="201c8cd8-38ab-4afd-9542-2f745acb02e6" containerID="5b0424b0b6fb824d8e945ce07917522bf746c566ec6f410b9bfbce0042c593a0" exitCode=0 Jan 28 18:36:04 crc kubenswrapper[4749]: I0128 18:36:04.049856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerDied","Data":"5b0424b0b6fb824d8e945ce07917522bf746c566ec6f410b9bfbce0042c593a0"} Jan 28 18:36:04 crc kubenswrapper[4749]: I0128 18:36:04.207904 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:36:04 crc kubenswrapper[4749]: I0128 18:36:04.870996 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:04 crc kubenswrapper[4749]: I0128 18:36:04.871054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:04 crc kubenswrapper[4749]: E0128 18:36:04.871147 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:36:04 crc kubenswrapper[4749]: E0128 18:36:04.871263 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thsj6" podUID="513ef52a-4532-409e-b188-5101ab5a3fff" Jan 28 18:36:05 crc kubenswrapper[4749]: I0128 18:36:05.055188 4749 generic.go:334] "Generic (PLEG): container finished" podID="201c8cd8-38ab-4afd-9542-2f745acb02e6" containerID="1f354f6c9dc5936117d88434fe3af2445405dd4ed6104cf97a0dfd22e48b2a86" exitCode=0 Jan 28 18:36:05 crc kubenswrapper[4749]: I0128 18:36:05.055232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerDied","Data":"1f354f6c9dc5936117d88434fe3af2445405dd4ed6104cf97a0dfd22e48b2a86"} Jan 28 18:36:05 crc kubenswrapper[4749]: I0128 18:36:05.062255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:05 crc kubenswrapper[4749]: E0128 18:36:05.062437 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:36:05 crc kubenswrapper[4749]: E0128 18:36:05.062522 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs podName:513ef52a-4532-409e-b188-5101ab5a3fff nodeName:}" failed. No retries permitted until 2026-01-28 18:36:13.062498206 +0000 UTC m=+41.074025041 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs") pod "network-metrics-daemon-thsj6" (UID: "513ef52a-4532-409e-b188-5101ab5a3fff") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 18:36:05 crc kubenswrapper[4749]: I0128 18:36:05.870863 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:05 crc kubenswrapper[4749]: I0128 18:36:05.870904 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:05 crc kubenswrapper[4749]: E0128 18:36:05.871007 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:36:05 crc kubenswrapper[4749]: E0128 18:36:05.871096 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:36:06 crc kubenswrapper[4749]: I0128 18:36:06.061906 4749 generic.go:334] "Generic (PLEG): container finished" podID="201c8cd8-38ab-4afd-9542-2f745acb02e6" containerID="141df075da5721635c436bec23ef1c12a2765b35f57a2bc69ce6859af35f177e" exitCode=0 Jan 28 18:36:06 crc kubenswrapper[4749]: I0128 18:36:06.061979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerDied","Data":"141df075da5721635c436bec23ef1c12a2765b35f57a2bc69ce6859af35f177e"} Jan 28 18:36:06 crc kubenswrapper[4749]: I0128 18:36:06.071751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerStarted","Data":"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc"} Jan 28 18:36:06 crc kubenswrapper[4749]: I0128 18:36:06.871145 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:06 crc kubenswrapper[4749]: I0128 18:36:06.871203 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:06 crc kubenswrapper[4749]: E0128 18:36:06.871605 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thsj6" podUID="513ef52a-4532-409e-b188-5101ab5a3fff" Jan 28 18:36:06 crc kubenswrapper[4749]: E0128 18:36:06.871862 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.078738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" event={"ID":"201c8cd8-38ab-4afd-9542-2f745acb02e6","Type":"ContainerStarted","Data":"ed42acc5502d953d04ea84ce931aa3e72dd7650283fa079acbcdc28c23e69e0f"} Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.079189 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.079233 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.101056 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bgqhl" podStartSLOduration=10.101037062 podStartE2EDuration="10.101037062s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:07.100657801 +0000 UTC m=+35.112184596" watchObservedRunningTime="2026-01-28 18:36:07.101037062 +0000 UTC m=+35.112563837" Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.104907 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.104971 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.125124 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podStartSLOduration=10.125108702 podStartE2EDuration="10.125108702s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:07.124860066 +0000 UTC m=+35.136386851" watchObservedRunningTime="2026-01-28 18:36:07.125108702 +0000 UTC m=+35.136635477" Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.870660 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:07 crc kubenswrapper[4749]: E0128 18:36:07.870819 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:36:07 crc kubenswrapper[4749]: I0128 18:36:07.871276 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:07 crc kubenswrapper[4749]: E0128 18:36:07.871369 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.081936 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.253735 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-thsj6"] Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.253831 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.253905 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thsj6" podUID="513ef52a-4532-409e-b188-5101ab5a3fff" Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.625044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.625155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.625178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.625213 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.625234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625356 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625371 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625381 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625419 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:24.625408059 +0000 UTC m=+52.636934834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625727 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:24.625718116 +0000 UTC m=+52.637244891 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625813 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625829 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625839 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625865 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:24.625856541 +0000 UTC m=+52.637383316 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625900 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625921 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625945 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:24.625933832 +0000 UTC m=+52.637460607 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.625960 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-28 18:36:24.625952683 +0000 UTC m=+52.637479458 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.752923 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:36:08 crc kubenswrapper[4749]: I0128 18:36:08.870596 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:08 crc kubenswrapper[4749]: E0128 18:36:08.870725 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:36:09 crc kubenswrapper[4749]: I0128 18:36:09.871289 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:09 crc kubenswrapper[4749]: I0128 18:36:09.871289 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:09 crc kubenswrapper[4749]: E0128 18:36:09.871499 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 18:36:09 crc kubenswrapper[4749]: E0128 18:36:09.871524 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 18:36:09 crc kubenswrapper[4749]: I0128 18:36:09.871312 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:09 crc kubenswrapper[4749]: E0128 18:36:09.871613 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-thsj6" podUID="513ef52a-4532-409e-b188-5101ab5a3fff" Jan 28 18:36:10 crc kubenswrapper[4749]: I0128 18:36:10.102252 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:36:10 crc kubenswrapper[4749]: I0128 18:36:10.870904 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:10 crc kubenswrapper[4749]: E0128 18:36:10.871088 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.629703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.629880 4749 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.698279 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.698665 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gtqfj"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.698882 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mg8d"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.699192 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.699518 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.699805 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.700605 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.700962 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.701148 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqtw5"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.701453 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.701471 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.701806 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.702972 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.703159 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.703630 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9dfj"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.704003 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kx92t"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.704511 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.705270 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.737805 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.739930 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.761051 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.761055 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.761130 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.761101 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.762077 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.762093 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.762740 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.762908 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.762990 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 28 
18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.763151 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.763196 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.763455 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.763465 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.763670 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.763719 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.764024 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.768297 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.768297 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.769750 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.769749 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.770122 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.770157 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.774607 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.776002 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.784313 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.784379 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.786009 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789802 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789823 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789860 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789887 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789904 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789931 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789939 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789977 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.789939 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.790209 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.791935 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.792098 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.792240 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.792443 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.792627 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.792759 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.793126 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.795382 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.795404 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.796970 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.798022 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dhdg9"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.799391 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.802027 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.802427 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.802516 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.804721 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.826802 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.827099 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.827174 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829144 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829189 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829287 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829426 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829514 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829538 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829148 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829641 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829797 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8tmng"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.829305 4749 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.832959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.833817 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.836675 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.837028 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.837241 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.837266 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-kjr8m"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.837363 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.837686 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.837890 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.838249 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pdnfq"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.838501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.838543 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.838656 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.838507 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.839164 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.839434 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.839563 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.839662 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.839959 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.840057 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.840189 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.846581 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.847001 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.847229 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.847916 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.848162 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.848282 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.850713 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.853617 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-r2995"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.854991 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.858824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6d2\" (UniqueName: \"kubernetes.io/projected/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-kube-api-access-9r6d2\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.858891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.858927 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-serving-cert\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.858943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcsgk\" (UniqueName: \"kubernetes.io/projected/f1f9743c-bcb4-47c0-8c0d-23530ecca520-kube-api-access-rcsgk\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-encryption-config\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859110 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-client-ca\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: 
\"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-etcd-client\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859155 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsj8f\" (UniqueName: \"kubernetes.io/projected/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-kube-api-access-xsj8f\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-config\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-serving-cert\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859273 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1f9743c-bcb4-47c0-8c0d-23530ecca520-audit-dir\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-audit-policies\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859312 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-config\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859366 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859388 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxnn2\" (UniqueName: \"kubernetes.io/projected/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-kube-api-access-bxnn2\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9af4e08d-4ce1-404c-b962-bb53ed89552c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: \"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae92b3db-2f69-410c-9cb0-6383fe6343ba-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-serving-cert\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-audit-dir\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: 
\"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-dir\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-serving-cert\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8k54\" (UniqueName: \"kubernetes.io/projected/f172bf9b-7eb3-46b1-8a40-9e566b01b433-kube-api-access-s8k54\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkm2t\" (UniqueName: \"kubernetes.io/projected/ae92b3db-2f69-410c-9cb0-6383fe6343ba-kube-api-access-zkm2t\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae92b3db-2f69-410c-9cb0-6383fe6343ba-images\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef9d771b-78e9-4131-afe5-f1f90025783e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wtt5v\" (UID: \"ef9d771b-78e9-4131-afe5-f1f90025783e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-etcd-serving-ca\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " 
pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f78dk\" (UniqueName: \"kubernetes.io/projected/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-kube-api-access-f78dk\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859860 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-config\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.859963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860113 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860254 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-etcd-client\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860343 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb5km\" (UniqueName: \"kubernetes.io/projected/9af4e08d-4ce1-404c-b962-bb53ed89552c-kube-api-access-lb5km\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: \"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-image-import-ca\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860436 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-encryption-config\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af4e08d-4ce1-404c-b962-bb53ed89552c-serving-cert\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: 
\"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-policies\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860586 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-config\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860619 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860686 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1f9743c-bcb4-47c0-8c0d-23530ecca520-node-pullsecrets\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-audit\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtqph\" (UniqueName: \"kubernetes.io/projected/ef9d771b-78e9-4131-afe5-f1f90025783e-kube-api-access-xtqph\") pod \"cluster-samples-operator-665b6dd947-wtt5v\" (UID: \"ef9d771b-78e9-4131-afe5-f1f90025783e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-client-ca\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk7kb\" (UniqueName: \"kubernetes.io/projected/4f3a2df0-0830-4e78-a168-31171cf06b76-kube-api-access-gk7kb\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860788 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae92b3db-2f69-410c-9cb0-6383fe6343ba-config\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860821 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.860896 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3a2df0-0830-4e78-a168-31171cf06b76-serving-cert\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.862084 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.862235 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.862440 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.863579 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.863784 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.863945 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.864073 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.888009 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.888822 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.889986 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.889508 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.890299 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.890503 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.889639 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.889702 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.889990 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.890054 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.890072 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.890201 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.889593 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.890820 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.892045 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.892092 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.899127 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.899159 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.899179 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.899258 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.899269 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 28 18:36:11 crc 
kubenswrapper[4749]: I0128 18:36:11.899515 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.899574 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.899656 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.900046 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.900144 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.900377 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.900437 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.900526 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.900686 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.900812 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.900891 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.908743 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.909714 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.914807 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.915730 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6lqrz"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.916099 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.916479 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.918135 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.918658 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k59mg"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.918970 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.919672 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.919946 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.920001 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5hd"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.920311 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.920555 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.920725 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.920765 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.920882 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.921032 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.921063 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.921093 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tzdvz"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.922674 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.925051 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.926968 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.928558 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.929833 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vddcm"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.930364 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.930897 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.931770 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.932638 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.932918 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.934339 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z74vl"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.935154 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.935730 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mg8d"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.936830 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-tms2k"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.937438 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.938107 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gtqfj"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.943849 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqtw5"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.945316 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.946888 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.947352 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.953439 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9dfj"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.962417 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.962752 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-srv-cert\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.962864 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89mw6\" (UniqueName: \"kubernetes.io/projected/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-kube-api-access-89mw6\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.962952 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dxnj\" (UniqueName: \"kubernetes.io/projected/c23f9f6a-3b73-4429-8f26-a8c5e79963c9-kube-api-access-6dxnj\") pod \"package-server-manager-789f6589d5-lbzfp\" (UID: \"c23f9f6a-3b73-4429-8f26-a8c5e79963c9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-apiservice-cert\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bd5855-6284-402f-ad55-dd7ba2817439-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: 
\"86bd5855-6284-402f-ad55-dd7ba2817439\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-console-config\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntkf\" (UniqueName: \"kubernetes.io/projected/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-kube-api-access-wntkf\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdlh\" (UniqueName: \"kubernetes.io/projected/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-kube-api-access-hqdlh\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-serving-cert\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae92b3db-2f69-410c-9cb0-6383fe6343ba-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-42phc\" (UID: \"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-trusted-ca-bundle\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-serving-cert\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-audit-dir\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-dir\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8k54\" (UniqueName: \"kubernetes.io/projected/f172bf9b-7eb3-46b1-8a40-9e566b01b433-kube-api-access-s8k54\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkm2t\" (UniqueName: \"kubernetes.io/projected/ae92b3db-2f69-410c-9cb0-6383fe6343ba-kube-api-access-zkm2t\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae92b3db-2f69-410c-9cb0-6383fe6343ba-images\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5954\" (UniqueName: \"kubernetes.io/projected/2b018047-b659-49f1-a494-aaf29e2925e3-kube-api-access-r5954\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-audit-dir\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b018047-b659-49f1-a494-aaf29e2925e3-config-volume\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.963981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hmh\" (UniqueName: \"kubernetes.io/projected/e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b-kube-api-access-c4hmh\") pod \"multus-admission-controller-857f4d67dd-r2995\" (UID: \"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef9d771b-78e9-4131-afe5-f1f90025783e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wtt5v\" (UID: \"ef9d771b-78e9-4131-afe5-f1f90025783e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs92j\" (UniqueName: \"kubernetes.io/projected/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-kube-api-access-rs92j\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-oauth-serving-cert\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2ttr\" (UniqueName: \"kubernetes.io/projected/86bd5855-6284-402f-ad55-dd7ba2817439-kube-api-access-b2ttr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: \"86bd5855-6284-402f-ad55-dd7ba2817439\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-etcd-serving-ca\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964180 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-oauth-config\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964205 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-auth-proxy-config\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964250 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-config\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964273 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f78dk\" (UniqueName: \"kubernetes.io/projected/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-kube-api-access-f78dk\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c23f9f6a-3b73-4429-8f26-a8c5e79963c9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lbzfp\" (UID: \"c23f9f6a-3b73-4429-8f26-a8c5e79963c9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964397 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-dir\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-machine-approver-tls\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a9afb6a-713a-476f-9be4-84eabb0905de-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964466 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b018047-b659-49f1-a494-aaf29e2925e3-secret-volume\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-r2995\" (UID: \"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964507 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-etcd-client\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb5km\" (UniqueName: 
\"kubernetes.io/projected/9af4e08d-4ce1-404c-b962-bb53ed89552c-kube-api-access-lb5km\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: \"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5m55\" (UniqueName: \"kubernetes.io/projected/9ed379e4-88c7-479e-9005-c7980ba50ccd-kube-api-access-z5m55\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-image-import-ca\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964684 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ed379e4-88c7-479e-9005-c7980ba50ccd-service-ca-bundle\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964710 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-default-certificate\") pod 
\"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-metrics-certs\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-encryption-config\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964781 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-profile-collector-cert\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjdrp\" (UniqueName: \"kubernetes.io/projected/384d6d65-9777-47a4-bef0-dbeeb9959e66-kube-api-access-cjdrp\") pod \"downloads-7954f5f757-8tmng\" (UID: \"384d6d65-9777-47a4-bef0-dbeeb9959e66\") " pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxlp\" (UniqueName: \"kubernetes.io/projected/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-kube-api-access-7sxlp\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-config\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af4e08d-4ce1-404c-b962-bb53ed89552c-serving-cert\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: \"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964897 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-policies\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" 
Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a9afb6a-713a-476f-9be4-84eabb0905de-config\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1f9743c-bcb4-47c0-8c0d-23530ecca520-node-pullsecrets\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.964986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42phc\" (UID: \"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965008 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-tmpfs\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-config\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965035 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-audit\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49l9q\" (UniqueName: \"kubernetes.io/projected/89157053-d5d1-40f0-8b36-411d637d8385-kube-api-access-49l9q\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-webhook-cert\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965206 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-stats-auth\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9afb6a-713a-476f-9be4-84eabb0905de-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtqph\" (UniqueName: \"kubernetes.io/projected/ef9d771b-78e9-4131-afe5-f1f90025783e-kube-api-access-xtqph\") pod \"cluster-samples-operator-665b6dd947-wtt5v\" (UID: \"ef9d771b-78e9-4131-afe5-f1f90025783e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-client-ca\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk7kb\" (UniqueName: \"kubernetes.io/projected/4f3a2df0-0830-4e78-a168-31171cf06b76-kube-api-access-gk7kb\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ae92b3db-2f69-410c-9cb0-6383fe6343ba-config\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae92b3db-2f69-410c-9cb0-6383fe6343ba-images\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3a2df0-0830-4e78-a168-31171cf06b76-serving-cert\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965480 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-config\") pod \"kube-apiserver-operator-766d6c64bb-42phc\" (UID: \"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-serving-cert\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965542 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcsgk\" (UniqueName: \"kubernetes.io/projected/f1f9743c-bcb4-47c0-8c0d-23530ecca520-kube-api-access-rcsgk\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6d2\" (UniqueName: \"kubernetes.io/projected/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-kube-api-access-9r6d2\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965579 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.965617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.967377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-audit\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.967504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.967624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-policies\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.968265 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-etcd-serving-ca\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.968433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1f9743c-bcb4-47c0-8c0d-23530ecca520-node-pullsecrets\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.968433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.968637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.969083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.969459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.969598 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3a2df0-0830-4e78-a168-31171cf06b76-serving-cert\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.969636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.970242 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-client-ca\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.970556 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.970555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.970843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.970889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.970956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ae92b3db-2f69-410c-9cb0-6383fe6343ba-config\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.971589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae92b3db-2f69-410c-9cb0-6383fe6343ba-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.972107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.972542 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef9d771b-78e9-4131-afe5-f1f90025783e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wtt5v\" (UID: \"ef9d771b-78e9-4131-afe5-f1f90025783e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.972125 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-config\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.972831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-encryption-config\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.972919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-image-import-ca\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.973261 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kx92t"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.973493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-serving-cert\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.974310 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.974611 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dhdg9"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.975048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.975156 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-serving-cert\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.975352 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8tmng"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.975987 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bd5855-6284-402f-ad55-dd7ba2817439-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: \"86bd5855-6284-402f-ad55-dd7ba2817439\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-encryption-config\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976071 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-client-ca\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-config\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-config\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-etcd-client\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsj8f\" (UniqueName: \"kubernetes.io/projected/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-kube-api-access-xsj8f\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-config\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1f9743c-bcb4-47c0-8c0d-23530ecca520-audit-dir\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-service-ca\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976268 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-serving-cert\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976307 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sfbj\" (UniqueName: \"kubernetes.io/projected/c85cc396-a32d-4118-8d79-fc50352372ba-kube-api-access-4sfbj\") pod \"migrator-59844c95c7-lwwzh\" (UID: \"c85cc396-a32d-4118-8d79-fc50352372ba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976340 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-config\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-audit-policies\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976445 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxnn2\" (UniqueName: \"kubernetes.io/projected/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-kube-api-access-bxnn2\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9af4e08d-4ce1-404c-b962-bb53ed89552c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: \"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-serving-cert\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-serving-cert\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-client-ca\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f9743c-bcb4-47c0-8c0d-23530ecca520-config\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.976877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.977420 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.977526 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.977703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1f9743c-bcb4-47c0-8c0d-23530ecca520-audit-dir\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.978118 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-config\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:11 crc 
kubenswrapper[4749]: I0128 18:36:11.978461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9af4e08d-4ce1-404c-b962-bb53ed89552c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: \"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.978910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.978936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-encryption-config\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.979281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.979485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-audit-policies\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.979504 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.981985 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.982084 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.982126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-serving-cert\") pod \"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.981465 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.982384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1f9743c-bcb4-47c0-8c0d-23530ecca520-etcd-client\") pod 
\"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.982516 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.983546 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.983663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.983837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.984179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af4e08d-4ce1-404c-b962-bb53ed89552c-serving-cert\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: \"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.984306 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kjr8m"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.985091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-etcd-client\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.987531 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.987571 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8p6sp"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.988133 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.988213 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8p6sp" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.988344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-serving-cert\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.989180 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6xtcc"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.989803 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.990031 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xx5t5"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.990948 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.990950 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gz9f7"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.991621 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.991840 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-r2995"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.994749 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.996032 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6lqrz"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.997148 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.998225 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5hd"] Jan 28 18:36:11 crc kubenswrapper[4749]: I0128 18:36:11.999281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xx5t5"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.000452 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.002958 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vddcm"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.004189 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.005489 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z74vl"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.006772 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.007484 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.008543 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tzdvz"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.009677 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.010629 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.011555 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k59mg"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.013885 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.015055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.016295 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gz9f7"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.017416 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.018576 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8p6sp"] Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.023075 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.042253 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.062657 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-oauth-config\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13b71160-3cba-4480-9180-22461548e389-signing-key\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077762 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a9afb6a-713a-476f-9be4-84eabb0905de-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077792 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43900a49-0d8f-48a8-b6af-385321464445-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-r2995\" (UID: \"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507e5f-704d-423c-a5dc-19d24cfb0014-config-volume\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13b71160-3cba-4480-9180-22461548e389-signing-cabundle\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-profile-collector-cert\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-default-certificate\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxlp\" (UniqueName: \"kubernetes.io/projected/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-kube-api-access-7sxlp\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.077997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42phc\" (UID: 
\"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078025 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-tmpfs\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-csi-data-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078078 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49l9q\" (UniqueName: \"kubernetes.io/projected/89157053-d5d1-40f0-8b36-411d637d8385-kube-api-access-49l9q\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-stats-auth\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078126 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw5lw\" (UniqueName: \"kubernetes.io/projected/2e43ff22-2928-48b1-b985-27926bcd5ef8-kube-api-access-dw5lw\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr5b7\" (UID: \"2e43ff22-2928-48b1-b985-27926bcd5ef8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/43900a49-0d8f-48a8-b6af-385321464445-ready\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-config\") pod \"kube-apiserver-operator-766d6c64bb-42phc\" (UID: \"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078214 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078252 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-config\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078286 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-service-ca\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078312 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sfbj\" (UniqueName: \"kubernetes.io/projected/c85cc396-a32d-4118-8d79-fc50352372ba-kube-api-access-4sfbj\") pod \"migrator-59844c95c7-lwwzh\" (UID: \"c85cc396-a32d-4118-8d79-fc50352372ba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-serving-cert\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89mw6\" (UniqueName: \"kubernetes.io/projected/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-kube-api-access-89mw6\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bd5855-6284-402f-ad55-dd7ba2817439-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: \"86bd5855-6284-402f-ad55-dd7ba2817439\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078466 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-srv-cert\") pod \"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-42phc\" (UID: \"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078524 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdlh\" (UniqueName: \"kubernetes.io/projected/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-kube-api-access-hqdlh\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-node-bootstrap-token\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5954\" (UniqueName: \"kubernetes.io/projected/2b018047-b659-49f1-a494-aaf29e2925e3-kube-api-access-r5954\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htn8p\" (UniqueName: \"kubernetes.io/projected/24507e5f-704d-423c-a5dc-19d24cfb0014-kube-api-access-htn8p\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-oauth-serving-cert\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078690 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6xps\" (UniqueName: \"kubernetes.io/projected/67719fcc-4750-4f5e-a48f-e51cd8580903-kube-api-access-c6xps\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2ttr\" (UniqueName: \"kubernetes.io/projected/86bd5855-6284-402f-ad55-dd7ba2817439-kube-api-access-b2ttr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: 
\"86bd5855-6284-402f-ad55-dd7ba2817439\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7h8\" (UniqueName: \"kubernetes.io/projected/43900a49-0d8f-48a8-b6af-385321464445-kube-api-access-bf7h8\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078776 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-auth-proxy-config\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078801 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c23f9f6a-3b73-4429-8f26-a8c5e79963c9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lbzfp\" (UID: \"c23f9f6a-3b73-4429-8f26-a8c5e79963c9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-machine-approver-tls\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b018047-b659-49f1-a494-aaf29e2925e3-secret-volume\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.078976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ed379e4-88c7-479e-9005-c7980ba50ccd-service-ca-bundle\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5m55\" (UniqueName: \"kubernetes.io/projected/9ed379e4-88c7-479e-9005-c7980ba50ccd-kube-api-access-z5m55\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e43ff22-2928-48b1-b985-27926bcd5ef8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr5b7\" (UID: \"2e43ff22-2928-48b1-b985-27926bcd5ef8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj7jq\" (UniqueName: \"kubernetes.io/projected/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-kube-api-access-zj7jq\") pod \"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-metrics-certs\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079176 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjdrp\" (UniqueName: \"kubernetes.io/projected/384d6d65-9777-47a4-bef0-dbeeb9959e66-kube-api-access-cjdrp\") pod \"downloads-7954f5f757-8tmng\" (UID: \"384d6d65-9777-47a4-bef0-dbeeb9959e66\") " pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079216 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-config\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-socket-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079297 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a9afb6a-713a-476f-9be4-84eabb0905de-config\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-webhook-cert\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079390 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9afb6a-713a-476f-9be4-84eabb0905de-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-registration-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-mountpoint-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079592 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27aff929-50f9-48fe-b978-cbc83c1d6c66-proxy-tls\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bd5855-6284-402f-ad55-dd7ba2817439-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: \"86bd5855-6284-402f-ad55-dd7ba2817439\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27aff929-50f9-48fe-b978-cbc83c1d6c66-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-config\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-serving-cert\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dxnj\" (UniqueName: \"kubernetes.io/projected/c23f9f6a-3b73-4429-8f26-a8c5e79963c9-kube-api-access-6dxnj\") pod \"package-server-manager-789f6589d5-lbzfp\" (UID: \"c23f9f6a-3b73-4429-8f26-a8c5e79963c9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079730 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-tmpfs\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-apiservice-cert\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079770 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7hp\" (UniqueName: \"kubernetes.io/projected/13b71160-3cba-4480-9180-22461548e389-kube-api-access-6r7hp\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-srv-cert\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 
18:36:12.079814 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-plugins-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-console-config\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.079299 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bd5855-6284-402f-ad55-dd7ba2817439-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: \"86bd5855-6284-402f-ad55-dd7ba2817439\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-service-ca\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wntkf\" (UniqueName: \"kubernetes.io/projected/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-kube-api-access-wntkf\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080357 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9cvh\" (UniqueName: \"kubernetes.io/projected/814398d6-5d05-4161-bd2e-3ff61d27f2c7-kube-api-access-z9cvh\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mvpg\" (UniqueName: \"kubernetes.io/projected/27aff929-50f9-48fe-b978-cbc83c1d6c66-kube-api-access-4mvpg\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-trusted-ca-bundle\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/24507e5f-704d-423c-a5dc-19d24cfb0014-metrics-tls\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs92j\" (UniqueName: \"kubernetes.io/projected/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-kube-api-access-rs92j\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b018047-b659-49f1-a494-aaf29e2925e3-config-volume\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080520 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hmh\" (UniqueName: \"kubernetes.io/projected/e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b-kube-api-access-c4hmh\") pod \"multus-admission-controller-857f4d67dd-r2995\" (UID: \"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080540 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-certs\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.080560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.082216 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-oauth-serving-cert\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.082346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ed379e4-88c7-479e-9005-c7980ba50ccd-service-ca-bundle\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.082652 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.082899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-profile-collector-cert\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.083226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-config\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.083722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9afb6a-713a-476f-9be4-84eabb0905de-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.084106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-webhook-cert\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.084172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-default-certificate\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.084563 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-trusted-ca-bundle\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.084811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-console-config\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.085211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c23f9f6a-3b73-4429-8f26-a8c5e79963c9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lbzfp\" (UID: \"c23f9f6a-3b73-4429-8f26-a8c5e79963c9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.085538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-srv-cert\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 
18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.085778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-apiservice-cert\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.085796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-serving-cert\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.085780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-serving-cert\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.086240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bd5855-6284-402f-ad55-dd7ba2817439-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: \"86bd5855-6284-402f-ad55-dd7ba2817439\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.086375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-machine-approver-tls\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.086449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-metrics-certs\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.086835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9ed379e4-88c7-479e-9005-c7980ba50ccd-stats-auth\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.087856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b018047-b659-49f1-a494-aaf29e2925e3-secret-volume\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.088648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-oauth-config\") pod \"console-f9d7485db-kjr8m\" 
(UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.091208 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a9afb6a-713a-476f-9be4-84eabb0905de-config\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.102566 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.111370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-auth-proxy-config\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.123434 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.141966 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.162253 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-mountpoint-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27aff929-50f9-48fe-b978-cbc83c1d6c66-proxy-tls\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27aff929-50f9-48fe-b978-cbc83c1d6c66-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r7hp\" (UniqueName: \"kubernetes.io/projected/13b71160-3cba-4480-9180-22461548e389-kube-api-access-6r7hp\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-plugins-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9cvh\" (UniqueName: \"kubernetes.io/projected/814398d6-5d05-4161-bd2e-3ff61d27f2c7-kube-api-access-z9cvh\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181153 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvpg\" (UniqueName: \"kubernetes.io/projected/27aff929-50f9-48fe-b978-cbc83c1d6c66-kube-api-access-4mvpg\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181173 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24507e5f-704d-423c-a5dc-19d24cfb0014-metrics-tls\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-certs\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13b71160-3cba-4480-9180-22461548e389-signing-key\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181265 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43900a49-0d8f-48a8-b6af-385321464445-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507e5f-704d-423c-a5dc-19d24cfb0014-config-volume\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181306 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13b71160-3cba-4480-9180-22461548e389-signing-cabundle\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-csi-data-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw5lw\" (UniqueName: \"kubernetes.io/projected/2e43ff22-2928-48b1-b985-27926bcd5ef8-kube-api-access-dw5lw\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr5b7\" (UID: \"2e43ff22-2928-48b1-b985-27926bcd5ef8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/43900a49-0d8f-48a8-b6af-385321464445-ready\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-srv-cert\") pod \"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-node-bootstrap-token\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htn8p\" (UniqueName: \"kubernetes.io/projected/24507e5f-704d-423c-a5dc-19d24cfb0014-kube-api-access-htn8p\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " 
pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6xps\" (UniqueName: \"kubernetes.io/projected/67719fcc-4750-4f5e-a48f-e51cd8580903-kube-api-access-c6xps\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7h8\" (UniqueName: \"kubernetes.io/projected/43900a49-0d8f-48a8-b6af-385321464445-kube-api-access-bf7h8\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e43ff22-2928-48b1-b985-27926bcd5ef8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr5b7\" (UID: \"2e43ff22-2928-48b1-b985-27926bcd5ef8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj7jq\" (UniqueName: \"kubernetes.io/projected/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-kube-api-access-zj7jq\") pod \"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.183292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-socket-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.183353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-registration-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.183403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.182553 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-plugins-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.183164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/43900a49-0d8f-48a8-b6af-385321464445-ready\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.183212 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.182395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-csi-data-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.183730 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-registration-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.181884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-mountpoint-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.183851 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/814398d6-5d05-4161-bd2e-3ff61d27f2c7-socket-dir\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.182357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43900a49-0d8f-48a8-b6af-385321464445-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.183239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27aff929-50f9-48fe-b978-cbc83c1d6c66-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.185793 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.191702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42phc\" (UID: \"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.202702 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.222566 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.242045 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.250669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-config\") pod \"kube-apiserver-operator-766d6c64bb-42phc\" (UID: \"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.263289 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.282598 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.302400 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.304432 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b018047-b659-49f1-a494-aaf29e2925e3-config-volume\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.342770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.352040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-r2995\" (UID: \"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.362677 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.381648 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-secret" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.401748 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.422120 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.441809 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.462362 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.481771 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.501178 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.514464 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.536571 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.541883 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.543683 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.561183 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.581856 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.601291 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.605917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-srv-cert\") pod \"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.621474 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.625046 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27aff929-50f9-48fe-b978-cbc83c1d6c66-proxy-tls\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.641997 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.661894 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.680910 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.701272 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.722173 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.741636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.761324 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.767725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e43ff22-2928-48b1-b985-27926bcd5ef8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr5b7\" (UID: \"2e43ff22-2928-48b1-b985-27926bcd5ef8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.781367 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.801778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.822131 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.841242 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.862007 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.870845 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.881892 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.902575 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.921483 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.940105 4749 request.go:700] Waited for 1.01827671s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.942645 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.961938 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 28 18:36:12 crc kubenswrapper[4749]: I0128 18:36:12.981925 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.003289 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.017221 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13b71160-3cba-4480-9180-22461548e389-signing-key\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.022519 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.033152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13b71160-3cba-4480-9180-22461548e389-signing-cabundle\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.041201 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.063381 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.083119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.096074 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " 
pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.100022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513ef52a-4532-409e-b188-5101ab5a3fff-metrics-certs\") pod \"network-metrics-daemon-thsj6\" (UID: \"513ef52a-4532-409e-b188-5101ab5a3fff\") " pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.108448 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.121453 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.142383 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.163244 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182091 4749 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182446 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/24507e5f-704d-423c-a5dc-19d24cfb0014-config-volume podName:24507e5f-704d-423c-a5dc-19d24cfb0014 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:13.682420164 +0000 UTC m=+41.693946939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/24507e5f-704d-423c-a5dc-19d24cfb0014-config-volume") pod "dns-default-gz9f7" (UID: "24507e5f-704d-423c-a5dc-19d24cfb0014") : failed to sync configmap cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182566 4749 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182647 4749 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182684 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24507e5f-704d-423c-a5dc-19d24cfb0014-metrics-tls podName:24507e5f-704d-423c-a5dc-19d24cfb0014 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:13.68265279 +0000 UTC m=+41.694179555 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/24507e5f-704d-423c-a5dc-19d24cfb0014-metrics-tls") pod "dns-default-gz9f7" (UID: "24507e5f-704d-423c-a5dc-19d24cfb0014") : failed to sync secret cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182576 4749 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182726 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-certs podName:67719fcc-4750-4f5e-a48f-e51cd8580903 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:13.682719092 +0000 UTC m=+41.694245867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-certs") pod "machine-config-server-6xtcc" (UID: "67719fcc-4750-4f5e-a48f-e51cd8580903") : failed to sync secret cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182708 4749 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182763 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-node-bootstrap-token podName:67719fcc-4750-4f5e-a48f-e51cd8580903 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:13.682733892 +0000 UTC m=+41.694260817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-node-bootstrap-token") pod "machine-config-server-6xtcc" (UID: "67719fcc-4750-4f5e-a48f-e51cd8580903") : failed to sync secret cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182780 4749 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182814 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-config podName:b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:13.682805204 +0000 UTC m=+41.694331979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" (UID: "b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2") : failed to sync configmap cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182582 4749 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182832 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-serving-cert podName:b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2 nodeName:}" failed. 
No retries permitted until 2026-01-28 18:36:13.682823644 +0000 UTC m=+41.694350419 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" (UID: "b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2") : failed to sync secret cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: E0128 18:36:13.182881 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist podName:43900a49-0d8f-48a8-b6af-385321464445 nodeName:}" failed. No retries permitted until 2026-01-28 18:36:13.682856205 +0000 UTC m=+41.694382980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-tms2k" (UID: "43900a49-0d8f-48a8-b6af-385321464445") : failed to sync configmap cache: timed out waiting for the condition Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.184050 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.201434 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.222946 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.241914 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.262424 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.273310 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-thsj6" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.282964 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.302780 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.323164 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.345358 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.362362 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.381616 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.402295 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.441754 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.443420 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.462954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.483029 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.501940 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.525383 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.543842 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.562969 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.578409 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-thsj6"] Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.589488 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.603079 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 
18:36:13.622590 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.658360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8k54\" (UniqueName: \"kubernetes.io/projected/f172bf9b-7eb3-46b1-8a40-9e566b01b433-kube-api-access-s8k54\") pod \"oauth-openshift-558db77b4-r9dfj\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.681893 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb5km\" (UniqueName: \"kubernetes.io/projected/9af4e08d-4ce1-404c-b962-bb53ed89552c-kube-api-access-lb5km\") pod \"openshift-config-operator-7777fb866f-kx92t\" (UID: \"9af4e08d-4ce1-404c-b962-bb53ed89552c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.697212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcsgk\" (UniqueName: \"kubernetes.io/projected/f1f9743c-bcb4-47c0-8c0d-23530ecca520-kube-api-access-rcsgk\") pod \"apiserver-76f77b778f-dhdg9\" (UID: \"f1f9743c-bcb4-47c0-8c0d-23530ecca520\") " pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.710276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.710452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24507e5f-704d-423c-a5dc-19d24cfb0014-metrics-tls\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.710506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-certs\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.710532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.710849 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507e5f-704d-423c-a5dc-19d24cfb0014-config-volume\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.710965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.711397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.712290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.712452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-node-bootstrap-token\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.713799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.720112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk7kb\" (UniqueName: \"kubernetes.io/projected/4f3a2df0-0830-4e78-a168-31171cf06b76-kube-api-access-gk7kb\") pod \"controller-manager-879f6c89f-pqtw5\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.739208 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6d2\" (UniqueName: \"kubernetes.io/projected/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-kube-api-access-9r6d2\") pod \"route-controller-manager-6576b87f9c-8mq6z\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.760074 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkm2t\" (UniqueName: \"kubernetes.io/projected/ae92b3db-2f69-410c-9cb0-6383fe6343ba-kube-api-access-zkm2t\") pod \"machine-api-operator-5694c8668f-7mg8d\" (UID: \"ae92b3db-2f69-410c-9cb0-6383fe6343ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.776850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f78dk\" (UniqueName: \"kubernetes.io/projected/cec9a986-a9e9-4d5d-af46-c913af6cc8f7-kube-api-access-f78dk\") pod 
\"authentication-operator-69f744f599-gtqfj\" (UID: \"cec9a986-a9e9-4d5d-af46-c913af6cc8f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.796636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtqph\" (UniqueName: \"kubernetes.io/projected/ef9d771b-78e9-4131-afe5-f1f90025783e-kube-api-access-xtqph\") pod \"cluster-samples-operator-665b6dd947-wtt5v\" (UID: \"ef9d771b-78e9-4131-afe5-f1f90025783e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.816362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxnn2\" (UniqueName: \"kubernetes.io/projected/611afe15-fc0f-4d7f-a734-5ea2e4e0e591-kube-api-access-bxnn2\") pod \"apiserver-7bbb656c7d-j5jdv\" (UID: \"611afe15-fc0f-4d7f-a734-5ea2e4e0e591\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.820017 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.832282 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.839665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsj8f\" (UniqueName: \"kubernetes.io/projected/2fcec2bb-dbd8-4850-8b2f-0a64535bca7a-kube-api-access-xsj8f\") pod \"openshift-apiserver-operator-796bbdcf4f-47c58\" (UID: \"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.841926 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.844314 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.853004 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.862251 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.863666 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.872226 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.881002 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.884111 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.887663 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.907942 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.926906 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.929844 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.942137 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.947860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-certs\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.949803 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.960463 4749 request.go:700] Waited for 1.970437527s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.962830 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.980611 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/67719fcc-4750-4f5e-a48f-e51cd8580903-node-bootstrap-token\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:13 crc kubenswrapper[4749]: I0128 18:36:13.983724 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.003510 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.021884 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.043398 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 28 18:36:14 crc 
kubenswrapper[4749]: I0128 18:36:14.050978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507e5f-704d-423c-a5dc-19d24cfb0014-config-volume\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.063150 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.086411 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.096843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24507e5f-704d-423c-a5dc-19d24cfb0014-metrics-tls\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.106312 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-thsj6" event={"ID":"513ef52a-4532-409e-b188-5101ab5a3fff","Type":"ContainerStarted","Data":"9a7913cada5aecf223d8d70bc23a38248589f847d3584bd7a31c0a97d27381e9"} Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.106404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-thsj6" event={"ID":"513ef52a-4532-409e-b188-5101ab5a3fff","Type":"ContainerStarted","Data":"26ac475a32a5af65ccb5d55e0e1b34cfbe1662785334e0d086ca5424a9361ccd"} Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.141943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a9afb6a-713a-476f-9be4-84eabb0905de-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wx864\" (UID: \"7a9afb6a-713a-476f-9be4-84eabb0905de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.159681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxlp\" (UniqueName: \"kubernetes.io/projected/8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c-kube-api-access-7sxlp\") pod \"packageserver-d55dfcdfc-gzsjh\" (UID: \"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.189190 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-42phc\" (UID: \"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.213493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdlh\" (UniqueName: \"kubernetes.io/projected/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-kube-api-access-hqdlh\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.225170 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-49l9q\" (UniqueName: \"kubernetes.io/projected/89157053-d5d1-40f0-8b36-411d637d8385-kube-api-access-49l9q\") pod \"console-f9d7485db-kjr8m\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.243614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5954\" (UniqueName: \"kubernetes.io/projected/2b018047-b659-49f1-a494-aaf29e2925e3-kube-api-access-r5954\") pod \"collect-profiles-29493750-s9qvj\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.262318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2ttr\" (UniqueName: \"kubernetes.io/projected/86bd5855-6284-402f-ad55-dd7ba2817439-kube-api-access-b2ttr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5w6qf\" (UID: \"86bd5855-6284-402f-ad55-dd7ba2817439\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.281437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sfbj\" (UniqueName: \"kubernetes.io/projected/c85cc396-a32d-4118-8d79-fc50352372ba-kube-api-access-4sfbj\") pod \"migrator-59844c95c7-lwwzh\" (UID: \"c85cc396-a32d-4118-8d79-fc50352372ba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.301215 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.301880 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjdrp\" (UniqueName: \"kubernetes.io/projected/384d6d65-9777-47a4-bef0-dbeeb9959e66-kube-api-access-cjdrp\") pod \"downloads-7954f5f757-8tmng\" (UID: \"384d6d65-9777-47a4-bef0-dbeeb9959e66\") " pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.315702 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.338840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e795e7a2-8d9a-4ed3-8b70-bd1f003035a1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s5cct\" (UID: \"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.346879 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9dfj"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.351788 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mg8d"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.355008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89mw6\" (UniqueName: \"kubernetes.io/projected/1b4b2ca8-1515-43f9-81c5-613cf8c05f9f-kube-api-access-89mw6\") pod \"catalog-operator-68c6474976-g8k42\" (UID: \"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.360430 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqtw5"] Jan 28 18:36:14 crc kubenswrapper[4749]: W0128 18:36:14.361988 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf172bf9b_7eb3_46b1_8a40_9e566b01b433.slice/crio-379dbeb7011d7954bd267a571e18ec86d792e1a6299518daa4ab838ea4455672 WatchSource:0}: Error finding container 379dbeb7011d7954bd267a571e18ec86d792e1a6299518daa4ab838ea4455672: Status 404 returned error can't find the container with id 379dbeb7011d7954bd267a571e18ec86d792e1a6299518daa4ab838ea4455672 Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.362776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5m55\" (UniqueName: \"kubernetes.io/projected/9ed379e4-88c7-479e-9005-c7980ba50ccd-kube-api-access-z5m55\") pod \"router-default-5444994796-pdnfq\" (UID: \"9ed379e4-88c7-479e-9005-c7980ba50ccd\") " pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.368438 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:14 crc kubenswrapper[4749]: W0128 18:36:14.370681 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae92b3db_2f69_410c_9cb0_6383fe6343ba.slice/crio-57c20d3ebaeea60b7cfa4839dbf6993d4ed4c64a49ea80508f3acbbf7e8c05d7 WatchSource:0}: Error finding container 57c20d3ebaeea60b7cfa4839dbf6993d4ed4c64a49ea80508f3acbbf7e8c05d7: Status 404 returned error can't find the container with id 57c20d3ebaeea60b7cfa4839dbf6993d4ed4c64a49ea80508f3acbbf7e8c05d7 Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.381719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntkf\" (UniqueName: \"kubernetes.io/projected/85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b-kube-api-access-wntkf\") pod \"machine-approver-56656f9798-g7gbx\" (UID: \"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.398415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs92j\" (UniqueName: \"kubernetes.io/projected/38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24-kube-api-access-rs92j\") pod \"service-ca-operator-777779d784-m2nf7\" (UID: \"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.409464 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.409588 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dhdg9"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.420583 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.423392 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hmh\" (UniqueName: \"kubernetes.io/projected/e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b-kube-api-access-c4hmh\") pod \"multus-admission-controller-857f4d67dd-r2995\" (UID: \"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.427764 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.437705 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.439241 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.449237 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dxnj\" (UniqueName: \"kubernetes.io/projected/c23f9f6a-3b73-4429-8f26-a8c5e79963c9-kube-api-access-6dxnj\") pod \"package-server-manager-789f6589d5-lbzfp\" (UID: \"c23f9f6a-3b73-4429-8f26-a8c5e79963c9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.450680 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.453498 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.457049 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.461983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9cvh\" (UniqueName: \"kubernetes.io/projected/814398d6-5d05-4161-bd2e-3ff61d27f2c7-kube-api-access-z9cvh\") pod \"csi-hostpathplugin-xx5t5\" (UID: \"814398d6-5d05-4161-bd2e-3ff61d27f2c7\") " pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.467465 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.473670 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.475172 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kx92t"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.482903 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gtqfj"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.483555 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.491130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvpg\" (UniqueName: \"kubernetes.io/projected/27aff929-50f9-48fe-b978-cbc83c1d6c66-kube-api-access-4mvpg\") pod \"machine-config-controller-84d6567774-jj2hf\" (UID: \"27aff929-50f9-48fe-b978-cbc83c1d6c66\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.499848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6xps\" (UniqueName: \"kubernetes.io/projected/67719fcc-4750-4f5e-a48f-e51cd8580903-kube-api-access-c6xps\") pod \"machine-config-server-6xtcc\" (UID: \"67719fcc-4750-4f5e-a48f-e51cd8580903\") " pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.503035 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.519571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htn8p\" (UniqueName: \"kubernetes.io/projected/24507e5f-704d-423c-a5dc-19d24cfb0014-kube-api-access-htn8p\") pod \"dns-default-gz9f7\" (UID: \"24507e5f-704d-423c-a5dc-19d24cfb0014\") " pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.525557 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.543699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw5lw\" (UniqueName: \"kubernetes.io/projected/2e43ff22-2928-48b1-b985-27926bcd5ef8-kube-api-access-dw5lw\") pod \"control-plane-machine-set-operator-78cbb6b69f-nr5b7\" (UID: \"2e43ff22-2928-48b1-b985-27926bcd5ef8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.553845 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.555813 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r7hp\" (UniqueName: \"kubernetes.io/projected/13b71160-3cba-4480-9180-22461548e389-kube-api-access-6r7hp\") pod \"service-ca-9c57cc56f-k59mg\" (UID: \"13b71160-3cba-4480-9180-22461548e389\") " pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.561667 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:36:14 crc kubenswrapper[4749]: W0128 18:36:14.567141 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611afe15_fc0f_4d7f_a734_5ea2e4e0e591.slice/crio-46c9c850bada82a91c7f7b4b5c65ac02ce0bcb83b382e7a2ba1e359ffe12f569 WatchSource:0}: Error finding container 46c9c850bada82a91c7f7b4b5c65ac02ce0bcb83b382e7a2ba1e359ffe12f569: Status 404 returned error can't find the container with id 46c9c850bada82a91c7f7b4b5c65ac02ce0bcb83b382e7a2ba1e359ffe12f569 Jan 28 18:36:14 crc kubenswrapper[4749]: W0128 18:36:14.570139 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af4e08d_4ce1_404c_b962_bb53ed89552c.slice/crio-85a9dd629970e8a4109381c646d6a9e7714435f47fe19b8e1cc750db3f52474b WatchSource:0}: Error finding container 85a9dd629970e8a4109381c646d6a9e7714435f47fe19b8e1cc750db3f52474b: Status 404 returned error can't find the container with id 85a9dd629970e8a4109381c646d6a9e7714435f47fe19b8e1cc750db3f52474b Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.570737 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.580654 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.583887 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.589178 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7h8\" (UniqueName: \"kubernetes.io/projected/43900a49-0d8f-48a8-b6af-385321464445-kube-api-access-bf7h8\") pod \"cni-sysctl-allowlist-ds-tms2k\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.604668 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.607588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj7jq\" (UniqueName: \"kubernetes.io/projected/d4d0f11e-2bf3-4b8e-9c42-aa763deade31-kube-api-access-zj7jq\") pod \"olm-operator-6b444d44fb-xtw8d\" (UID: \"d4d0f11e-2bf3-4b8e-9c42-aa763deade31\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.618378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6ctqb\" (UID: \"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.638824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.641735 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kjr8m"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.649563 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.656701 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6xtcc" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.661062 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.680222 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.686209 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.732252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9468344-0b6e-436d-a92a-0d53ee9bb179-config\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.732876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90837a80-df80-457d-a061-ad3933620e19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.732913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8mc6\" (UniqueName: \"kubernetes.io/projected/d0092231-157d-498a-a811-c533dccee8ce-kube-api-access-v8mc6\") pod \"marketplace-operator-79b997595-z74vl\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.732938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0092231-157d-498a-a811-c533dccee8ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z74vl\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.732956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9468344-0b6e-436d-a92a-0d53ee9bb179-serving-cert\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.732977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sd52\" (UniqueName: \"kubernetes.io/projected/802ae694-4a2b-4bb4-b826-179b981b28c2-kube-api-access-9sd52\") pod \"ingress-canary-8p6sp\" (UID: \"802ae694-4a2b-4bb4-b826-179b981b28c2\") " pod="openshift-ingress-canary/ingress-canary-8p6sp" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733030 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0400ab50-bd41-421e-a093-73dd02d7bf9a-images\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-trusted-ca\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af109a22-6845-4bb1-a272-ccfe19c51573-metrics-tls\") pod \"dns-operator-744455d44c-tzdvz\" (UID: \"af109a22-6845-4bb1-a272-ccfe19c51573\") " pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733128 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctnn7\" (UniqueName: \"kubernetes.io/projected/af109a22-6845-4bb1-a272-ccfe19c51573-kube-api-access-ctnn7\") pod \"dns-operator-744455d44c-tzdvz\" (UID: \"af109a22-6845-4bb1-a272-ccfe19c51573\") " pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d16abc4-7306-4cf8-b173-ae9e007f4519-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-registry-certificates\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733227 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90837a80-df80-457d-a061-ad3933620e19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a45db790-79fe-4928-b701-d64737024f60-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733278 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de83910-49eb-4462-850c-d2e7e84afb96-serving-cert\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733295 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-service-ca\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a45db790-79fe-4928-b701-d64737024f60-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d16abc4-7306-4cf8-b173-ae9e007f4519-metrics-tls\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733391 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcx6p\" (UniqueName: \"kubernetes.io/projected/d9468344-0b6e-436d-a92a-0d53ee9bb179-kube-api-access-kcx6p\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-client\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0092231-157d-498a-a811-c533dccee8ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z74vl\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-registry-tls\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733603 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-ca\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: 
I0128 18:36:14.733697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0400ab50-bd41-421e-a093-73dd02d7bf9a-proxy-tls\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-bound-sa-token\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d16abc4-7306-4cf8-b173-ae9e007f4519-trusted-ca\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.733879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp88z\" (UniqueName: \"kubernetes.io/projected/90837a80-df80-457d-a061-ad3933620e19-kube-api-access-qp88z\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.734202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgwr\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-kube-api-access-2sgwr\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.735381 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9468344-0b6e-436d-a92a-0d53ee9bb179-trusted-ca\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: E0128 18:36:14.735393 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:15.235375137 +0000 UTC m=+43.246901912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.735483 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmnrx\" (UniqueName: \"kubernetes.io/projected/0400ab50-bd41-421e-a093-73dd02d7bf9a-kube-api-access-nmnrx\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.735654 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfrq4\" (UniqueName: \"kubernetes.io/projected/6de83910-49eb-4462-850c-d2e7e84afb96-kube-api-access-mfrq4\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.735966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0400ab50-bd41-421e-a093-73dd02d7bf9a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.736075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802ae694-4a2b-4bb4-b826-179b981b28c2-cert\") pod \"ingress-canary-8p6sp\" (UID: \"802ae694-4a2b-4bb4-b826-179b981b28c2\") " pod="openshift-ingress-canary/ingress-canary-8p6sp" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.736231 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-config\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.736532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf4xp\" (UniqueName: \"kubernetes.io/projected/9d16abc4-7306-4cf8-b173-ae9e007f4519-kube-api-access-vf4xp\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.822024 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.838490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.838834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9468344-0b6e-436d-a92a-0d53ee9bb179-trusted-ca\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.838876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmnrx\" (UniqueName: \"kubernetes.io/projected/0400ab50-bd41-421e-a093-73dd02d7bf9a-kube-api-access-nmnrx\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.838904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfrq4\" (UniqueName: \"kubernetes.io/projected/6de83910-49eb-4462-850c-d2e7e84afb96-kube-api-access-mfrq4\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.838931 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0400ab50-bd41-421e-a093-73dd02d7bf9a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.838955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802ae694-4a2b-4bb4-b826-179b981b28c2-cert\") pod \"ingress-canary-8p6sp\" (UID: \"802ae694-4a2b-4bb4-b826-179b981b28c2\") " pod="openshift-ingress-canary/ingress-canary-8p6sp" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.838972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-config\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.838987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf4xp\" (UniqueName: \"kubernetes.io/projected/9d16abc4-7306-4cf8-b173-ae9e007f4519-kube-api-access-vf4xp\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839025 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9468344-0b6e-436d-a92a-0d53ee9bb179-config\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 
18:36:14.839100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90837a80-df80-457d-a061-ad3933620e19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8mc6\" (UniqueName: \"kubernetes.io/projected/d0092231-157d-498a-a811-c533dccee8ce-kube-api-access-v8mc6\") pod \"marketplace-operator-79b997595-z74vl\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0092231-157d-498a-a811-c533dccee8ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z74vl\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9468344-0b6e-436d-a92a-0d53ee9bb179-serving-cert\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sd52\" (UniqueName: \"kubernetes.io/projected/802ae694-4a2b-4bb4-b826-179b981b28c2-kube-api-access-9sd52\") pod \"ingress-canary-8p6sp\" (UID: \"802ae694-4a2b-4bb4-b826-179b981b28c2\") " pod="openshift-ingress-canary/ingress-canary-8p6sp" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-trusted-ca\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af109a22-6845-4bb1-a272-ccfe19c51573-metrics-tls\") pod \"dns-operator-744455d44c-tzdvz\" (UID: \"af109a22-6845-4bb1-a272-ccfe19c51573\") " pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0400ab50-bd41-421e-a093-73dd02d7bf9a-images\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839345 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9d16abc4-7306-4cf8-b173-ae9e007f4519-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-registry-certificates\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctnn7\" (UniqueName: \"kubernetes.io/projected/af109a22-6845-4bb1-a272-ccfe19c51573-kube-api-access-ctnn7\") pod \"dns-operator-744455d44c-tzdvz\" (UID: \"af109a22-6845-4bb1-a272-ccfe19c51573\") " pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90837a80-df80-457d-a061-ad3933620e19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a45db790-79fe-4928-b701-d64737024f60-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de83910-49eb-4462-850c-d2e7e84afb96-serving-cert\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-service-ca\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a45db790-79fe-4928-b701-d64737024f60-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d16abc4-7306-4cf8-b173-ae9e007f4519-metrics-tls\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839554 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcx6p\" (UniqueName: \"kubernetes.io/projected/d9468344-0b6e-436d-a92a-0d53ee9bb179-kube-api-access-kcx6p\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839599 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-client\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0092231-157d-498a-a811-c533dccee8ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z74vl\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-registry-tls\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-ca\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-bound-sa-token\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0400ab50-bd41-421e-a093-73dd02d7bf9a-proxy-tls\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d16abc4-7306-4cf8-b173-ae9e007f4519-trusted-ca\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp88z\" (UniqueName: 
\"kubernetes.io/projected/90837a80-df80-457d-a061-ad3933620e19-kube-api-access-qp88z\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.839836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sgwr\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-kube-api-access-2sgwr\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: E0128 18:36:14.861936 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:15.361897194 +0000 UTC m=+43.373423969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.865038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9468344-0b6e-436d-a92a-0d53ee9bb179-trusted-ca\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.865447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d16abc4-7306-4cf8-b173-ae9e007f4519-trusted-ca\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.868286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-trusted-ca\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.870249 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90837a80-df80-457d-a061-ad3933620e19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.871434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0092231-157d-498a-a811-c533dccee8ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z74vl\" (UID: 
\"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.871947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0400ab50-bd41-421e-a093-73dd02d7bf9a-images\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.872184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a45db790-79fe-4928-b701-d64737024f60-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.872291 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-registry-certificates\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.874196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d16abc4-7306-4cf8-b173-ae9e007f4519-metrics-tls\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.874212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-ca\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.874525 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.874701 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0400ab50-bd41-421e-a093-73dd02d7bf9a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.874822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-registry-tls\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.874955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0092231-157d-498a-a811-c533dccee8ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z74vl\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.875537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90837a80-df80-457d-a061-ad3933620e19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.881859 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9468344-0b6e-436d-a92a-0d53ee9bb179-config\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.886192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-config\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.886497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-service-ca\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.887091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sgwr\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-kube-api-access-2sgwr\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.891837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de83910-49eb-4462-850c-d2e7e84afb96-serving-cert\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.892559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0400ab50-bd41-421e-a093-73dd02d7bf9a-proxy-tls\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.892838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6de83910-49eb-4462-850c-d2e7e84afb96-etcd-client\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.893277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9468344-0b6e-436d-a92a-0d53ee9bb179-serving-cert\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.894539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a45db790-79fe-4928-b701-d64737024f60-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.895387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802ae694-4a2b-4bb4-b826-179b981b28c2-cert\") pod \"ingress-canary-8p6sp\" (UID: \"802ae694-4a2b-4bb4-b826-179b981b28c2\") " pod="openshift-ingress-canary/ingress-canary-8p6sp" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.902739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af109a22-6845-4bb1-a272-ccfe19c51573-metrics-tls\") pod \"dns-operator-744455d44c-tzdvz\" (UID: \"af109a22-6845-4bb1-a272-ccfe19c51573\") " pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.904878 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-r2995"] Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.909790 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.923293 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp88z\" (UniqueName: \"kubernetes.io/projected/90837a80-df80-457d-a061-ad3933620e19-kube-api-access-qp88z\") pod \"kube-storage-version-migrator-operator-b67b599dd-n56c5\" (UID: \"90837a80-df80-457d-a061-ad3933620e19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.942409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d16abc4-7306-4cf8-b173-ae9e007f4519-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.942608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:14 crc kubenswrapper[4749]: E0128 18:36:14.944770 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:15.44474172 +0000 UTC m=+43.456268685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.974139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmnrx\" (UniqueName: \"kubernetes.io/projected/0400ab50-bd41-421e-a093-73dd02d7bf9a-kube-api-access-nmnrx\") pod \"machine-config-operator-74547568cd-srdjk\" (UID: \"0400ab50-bd41-421e-a093-73dd02d7bf9a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.978958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfrq4\" (UniqueName: \"kubernetes.io/projected/6de83910-49eb-4462-850c-d2e7e84afb96-kube-api-access-mfrq4\") pod \"etcd-operator-b45778765-6lqrz\" (UID: \"6de83910-49eb-4462-850c-d2e7e84afb96\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:14 crc kubenswrapper[4749]: I0128 18:36:14.999485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sd52\" (UniqueName: \"kubernetes.io/projected/802ae694-4a2b-4bb4-b826-179b981b28c2-kube-api-access-9sd52\") pod \"ingress-canary-8p6sp\" (UID: \"802ae694-4a2b-4bb4-b826-179b981b28c2\") " pod="openshift-ingress-canary/ingress-canary-8p6sp" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.019177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8mc6\" (UniqueName: \"kubernetes.io/projected/d0092231-157d-498a-a811-c533dccee8ce-kube-api-access-v8mc6\") pod \"marketplace-operator-79b997595-z74vl\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:15 crc kubenswrapper[4749]: W0128 18:36:15.022024 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c3a6c4_f7e1_4b36_9b49_e9264fb44d5b.slice/crio-d5a1c1ebfaae84b965ac7f604f04f86762f1774448b1f2780130c96b3d3dcecf WatchSource:0}: Error finding container d5a1c1ebfaae84b965ac7f604f04f86762f1774448b1f2780130c96b3d3dcecf: Status 404 returned error can't find the container with id d5a1c1ebfaae84b965ac7f604f04f86762f1774448b1f2780130c96b3d3dcecf Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.045825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.046310 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:15.546286573 +0000 UTC m=+43.557813348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.053002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctnn7\" (UniqueName: \"kubernetes.io/projected/af109a22-6845-4bb1-a272-ccfe19c51573-kube-api-access-ctnn7\") pod \"dns-operator-744455d44c-tzdvz\" (UID: \"af109a22-6845-4bb1-a272-ccfe19c51573\") " pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.062699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-bound-sa-token\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.078173 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcx6p\" (UniqueName: \"kubernetes.io/projected/d9468344-0b6e-436d-a92a-0d53ee9bb179-kube-api-access-kcx6p\") pod \"console-operator-58897d9998-vddcm\" (UID: \"d9468344-0b6e-436d-a92a-0d53ee9bb179\") " pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.123418 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.127967 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.130465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" event={"ID":"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24","Type":"ContainerStarted","Data":"aab20796f368195888a989d9bcdbcf1de46781e180a8baf4a4f6390ed26b366b"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.136084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" event={"ID":"86bd5855-6284-402f-ad55-dd7ba2817439","Type":"ContainerStarted","Data":"8ec4d527117841fba5eb5e35b8c6d1dcd7370cf2a86b8321b1b1c010cb18462b"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.137513 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" event={"ID":"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c","Type":"ContainerStarted","Data":"994f08b744aaba813554a91fb3e0f9b578e69fe9ecb3b13936d2f486bdeccf8a"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.138421 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.139498 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.140318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" event={"ID":"4f3a2df0-0830-4e78-a168-31171cf06b76","Type":"ContainerStarted","Data":"5573b792054a4fc452aea653e05c6446b98b91b16923cd5446799b23ffc251b6"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.140400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" event={"ID":"4f3a2df0-0830-4e78-a168-31171cf06b76","Type":"ContainerStarted","Data":"670ace17908b647ea2514b8b25a43bf80ad83034024842eba8b8bd6429e3b649"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.140889 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.142486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" event={"ID":"ef9d771b-78e9-4131-afe5-f1f90025783e","Type":"ContainerStarted","Data":"624235cb231d88f79a44e592a62237815596b56b312b7c9ebdbb4a1b7047bc56"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.143826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" event={"ID":"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a","Type":"ContainerStarted","Data":"70c0a40bcb80bed6b8f34417866388214324ca409d40b926e6a176c6662ba1f3"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.144544 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pqtw5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.144598 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.149236 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.149894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-thsj6" event={"ID":"513ef52a-4532-409e-b188-5101ab5a3fff","Type":"ContainerStarted","Data":"0bb4d6650bace7333b2916bc8aa84bc86754162870df1fc7c65c5b2e932de386"} Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.154269 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-28 18:36:15.654241275 +0000 UTC m=+43.665768050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.154491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kjr8m" event={"ID":"89157053-d5d1-40f0-8b36-411d637d8385","Type":"ContainerStarted","Data":"b10ad17b9a532ff3723c33954acdbd1251567d400e6d17feb30622ba7fc10934"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.161591 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" event={"ID":"ae92b3db-2f69-410c-9cb0-6383fe6343ba","Type":"ContainerStarted","Data":"aecddd70f2e0293cacfb83ae18831f6a66ba45e3006f3455da203df95d8a2d39"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.161634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" event={"ID":"ae92b3db-2f69-410c-9cb0-6383fe6343ba","Type":"ContainerStarted","Data":"57c20d3ebaeea60b7cfa4839dbf6993d4ed4c64a49ea80508f3acbbf7e8c05d7"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.165056 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" event={"ID":"9af4e08d-4ce1-404c-b962-bb53ed89552c","Type":"ContainerStarted","Data":"85a9dd629970e8a4109381c646d6a9e7714435f47fe19b8e1cc750db3f52474b"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.166920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" event={"ID":"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b","Type":"ContainerStarted","Data":"d5a1c1ebfaae84b965ac7f604f04f86762f1774448b1f2780130c96b3d3dcecf"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.175353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" event={"ID":"f1f9743c-bcb4-47c0-8c0d-23530ecca520","Type":"ContainerStarted","Data":"cabaf9107a6ac9e85e166bca1170d8237512e14074c7ed0a437703fe84dfbd84"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.177680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" event={"ID":"611afe15-fc0f-4d7f-a734-5ea2e4e0e591","Type":"ContainerStarted","Data":"46c9c850bada82a91c7f7b4b5c65ac02ce0bcb83b382e7a2ba1e359ffe12f569"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.185140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf4xp\" (UniqueName: \"kubernetes.io/projected/9d16abc4-7306-4cf8-b173-ae9e007f4519-kube-api-access-vf4xp\") pod \"ingress-operator-5b745b69d9-r56pg\" (UID: \"9d16abc4-7306-4cf8-b173-ae9e007f4519\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.185628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" 
event={"ID":"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e","Type":"ContainerStarted","Data":"14f6a585a191996c28599333710916de8f8724e55e58c7c6f0fd7238028e59ee"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.198012 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.199047 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.202496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" event={"ID":"f172bf9b-7eb3-46b1-8a40-9e566b01b433","Type":"ContainerStarted","Data":"59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.202578 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.202594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" event={"ID":"f172bf9b-7eb3-46b1-8a40-9e566b01b433","Type":"ContainerStarted","Data":"379dbeb7011d7954bd267a571e18ec86d792e1a6299518daa4ab838ea4455672"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.204613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" event={"ID":"cec9a986-a9e9-4d5d-af46-c913af6cc8f7","Type":"ContainerStarted","Data":"979c9daf97d0fcf1db9697b270ccae749fc29d6130e717e2eacb736576f3d371"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.209534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" event={"ID":"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b","Type":"ContainerStarted","Data":"48bd8676b6881534dc3cce6487128b8599b22ee3e7baa84ed20d488e6db7a4a7"} Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.217202 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.228977 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.231670 4749 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r9dfj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.231823 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" podUID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.231882 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.244218 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.247626 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8p6sp" Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.250440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.252132 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:15.752110329 +0000 UTC m=+43.763637114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.256048 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.268557 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.270827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.284260 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.352897 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.356995 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:15.856970154 +0000 UTC m=+43.868497139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.460499 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.463415 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:15.963386118 +0000 UTC m=+43.974912893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.465425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.465870 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:15.965860769 +0000 UTC m=+43.977387544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: W0128 18:36:15.502981 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43900a49_0d8f_48a8_b6af_385321464445.slice/crio-15f563570ee42693245cfffb52b00a29355508cc34073d5a4bcbb1ae3f50ea78 WatchSource:0}: Error finding container 15f563570ee42693245cfffb52b00a29355508cc34073d5a4bcbb1ae3f50ea78: Status 404 returned error can't find the container with id 15f563570ee42693245cfffb52b00a29355508cc34073d5a4bcbb1ae3f50ea78 Jan 28 18:36:15 crc kubenswrapper[4749]: W0128 18:36:15.515976 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4b2ca8_1515_43f9_81c5_613cf8c05f9f.slice/crio-4a0cae885d28a25a5cfbacec4dd2998fcf0e2ff897602cb19d3e4fdd9036af0d WatchSource:0}: Error finding container 4a0cae885d28a25a5cfbacec4dd2998fcf0e2ff897602cb19d3e4fdd9036af0d: Status 404 returned error can't find the container with id 4a0cae885d28a25a5cfbacec4dd2998fcf0e2ff897602cb19d3e4fdd9036af0d Jan 28 18:36:15 crc kubenswrapper[4749]: W0128 18:36:15.526702 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e43ff22_2928_48b1_b985_27926bcd5ef8.slice/crio-75bd6c310ca50848e6c83d4658bf104defdf1b944aba2d5a19e8af42bba3ba33 WatchSource:0}: Error finding container 75bd6c310ca50848e6c83d4658bf104defdf1b944aba2d5a19e8af42bba3ba33: Status 404 returned error can't find the container with id 75bd6c310ca50848e6c83d4658bf104defdf1b944aba2d5a19e8af42bba3ba33 Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.551528 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8tmng"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.566193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.566483 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.06646894 +0000 UTC m=+44.077995715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.641619 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.668897 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.676828 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.176773659 +0000 UTC m=+44.188300424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.679682 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k59mg"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.697621 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xx5t5"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.705429 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gz9f7"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.755994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.771512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.771726 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.271695051 +0000 UTC m=+44.283221826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.772194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.774821 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.274796506 +0000 UTC m=+44.286323281 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: W0128 18:36:15.792808 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc23f9f6a_3b73_4429_8f26_a8c5e79963c9.slice/crio-5da8e2eea85c6bdec2b3febf12a424ad61cdec69c214f18534bdf24fa45c3de0 WatchSource:0}: Error finding container 5da8e2eea85c6bdec2b3febf12a424ad61cdec69c214f18534bdf24fa45c3de0: Status 404 returned error can't find the container with id 5da8e2eea85c6bdec2b3febf12a424ad61cdec69c214f18534bdf24fa45c3de0 Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.794005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d"] Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.872898 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:15 crc kubenswrapper[4749]: E0128 18:36:15.874483 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.374453375 +0000 UTC m=+44.385980150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:15 crc kubenswrapper[4749]: W0128 18:36:15.894840 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d0f11e_2bf3_4b8e_9c42_aa763deade31.slice/crio-0b7a9ef1fbd9be089b46bfafa7ab1bca1ea2dcbb2e1805461364ff3297c14d86 WatchSource:0}: Error finding container 0b7a9ef1fbd9be089b46bfafa7ab1bca1ea2dcbb2e1805461364ff3297c14d86: Status 404 returned error can't find the container with id 0b7a9ef1fbd9be089b46bfafa7ab1bca1ea2dcbb2e1805461364ff3297c14d86 Jan 28 18:36:15 crc kubenswrapper[4749]: I0128 18:36:15.988825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.000227 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.500200123 +0000 UTC m=+44.511726908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.094901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.096282 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.596254102 +0000 UTC m=+44.607780877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.198659 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.199412 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.699391176 +0000 UTC m=+44.710917951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.222779 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tzdvz"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.279275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" event={"ID":"ae92b3db-2f69-410c-9cb0-6383fe6343ba","Type":"ContainerStarted","Data":"39232fb9c10d8f82477a68d1a3c1364b3388ae6b70e9936ded4ffec4a84d1afb"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.284114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8tmng" event={"ID":"384d6d65-9777-47a4-bef0-dbeeb9959e66","Type":"ContainerStarted","Data":"e9906042d36ff5f260a992baa756cf0a570b66a823e1e7e4b798ebfe45a8ba08"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.289306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" event={"ID":"2b018047-b659-49f1-a494-aaf29e2925e3","Type":"ContainerStarted","Data":"9fd350c98028a35487d88ffa0cbb702595f42a89a2d127287c1a37197d3b0592"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.299924 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.300171 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.80012652 +0000 UTC m=+44.811653295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.301109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.302250 4749 generic.go:334] "Generic (PLEG): container finished" podID="9af4e08d-4ce1-404c-b962-bb53ed89552c" containerID="f3ee7e2f41ac00882b796cdf5d1d7aa081584029ec8c5d6b817e8774f6f25a68" exitCode=0 Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.303454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" event={"ID":"9af4e08d-4ce1-404c-b962-bb53ed89552c","Type":"ContainerDied","Data":"f3ee7e2f41ac00882b796cdf5d1d7aa081584029ec8c5d6b817e8774f6f25a68"} Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.304815 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.801557395 +0000 UTC m=+44.813084180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.312393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" event={"ID":"86bd5855-6284-402f-ad55-dd7ba2817439","Type":"ContainerStarted","Data":"b8b116db34a8fa3186e6a57577d4f2eb84d8e39d06168aba6d5f0d2fc186813f"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.317036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" event={"ID":"2fcec2bb-dbd8-4850-8b2f-0a64535bca7a","Type":"ContainerStarted","Data":"0c4797c5abd055b628328d123f400f76c212c1c4f3c2be5601506a8f417c3326"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.347290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" event={"ID":"8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c","Type":"ContainerStarted","Data":"60f2519abd4028b4dde092088b07b7d37ec2e96b9e2c9f62f5a0d0470c76e704"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.348715 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.358717 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gzsjh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body= Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.358793 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" podUID="8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.371179 4749 generic.go:334] "Generic (PLEG): container finished" podID="611afe15-fc0f-4d7f-a734-5ea2e4e0e591" containerID="4aae3857ef7a355d84d353114d27e59358b92c43cdd72617f2468783b349b73a" exitCode=0 Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.371401 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" event={"ID":"611afe15-fc0f-4d7f-a734-5ea2e4e0e591","Type":"ContainerDied","Data":"4aae3857ef7a355d84d353114d27e59358b92c43cdd72617f2468783b349b73a"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.387024 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" event={"ID":"d4d0f11e-2bf3-4b8e-9c42-aa763deade31","Type":"ContainerStarted","Data":"0b7a9ef1fbd9be089b46bfafa7ab1bca1ea2dcbb2e1805461364ff3297c14d86"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.403677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.405762 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:16.905740624 +0000 UTC m=+44.917267399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.411048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" event={"ID":"c85cc396-a32d-4118-8d79-fc50352372ba","Type":"ContainerStarted","Data":"fc930079945a17da6014728c602b91f927f0986852d9446273e3de9436d79a3a"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.434443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" event={"ID":"cec9a986-a9e9-4d5d-af46-c913af6cc8f7","Type":"ContainerStarted","Data":"ceb31a03f4f42ba55e983fa50375df7473c9437eb4a5fb0ec52af9fcbd824dcb"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.440125 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6lqrz"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.457903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" event={"ID":"43900a49-0d8f-48a8-b6af-385321464445","Type":"ContainerStarted","Data":"15f563570ee42693245cfffb52b00a29355508cc34073d5a4bcbb1ae3f50ea78"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.468855 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.472747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.478077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" event={"ID":"c23f9f6a-3b73-4429-8f26-a8c5e79963c9","Type":"ContainerStarted","Data":"5da8e2eea85c6bdec2b3febf12a424ad61cdec69c214f18534bdf24fa45c3de0"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.500770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gz9f7" event={"ID":"24507e5f-704d-423c-a5dc-19d24cfb0014","Type":"ContainerStarted","Data":"928e030093d30a536b73bb895154e74eb2d0fdd0acdf1e5dbd983b7a28dccdbb"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.505801 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.510028 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.009985805 +0000 UTC m=+45.021512580 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.529814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" event={"ID":"ef9d771b-78e9-4131-afe5-f1f90025783e","Type":"ContainerStarted","Data":"771c5ec922e282d87f24d0289dcdfe76b061e706fdb32807c6fb74616faca088"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.533860 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pdnfq" event={"ID":"9ed379e4-88c7-479e-9005-c7980ba50ccd","Type":"ContainerStarted","Data":"ae5780f121e928300a16fc1b3496764e74c8185061c01e872f0dc057d0d8a563"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.549684 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-thsj6" podStartSLOduration=19.549668609 podStartE2EDuration="19.549668609s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:16.546880311 +0000 UTC m=+44.558407086" watchObservedRunningTime="2026-01-28 18:36:16.549668609 +0000 UTC m=+44.561195384" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.558626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" event={"ID":"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b","Type":"ContainerStarted","Data":"e51f59b490911c3fe93e40bdea77125baea443f48fbb2d199a3dc921c67324c6"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.565720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" event={"ID":"814398d6-5d05-4161-bd2e-3ff61d27f2c7","Type":"ContainerStarted","Data":"6a24b798b18caf76e2e0953d5b38f522c60f7dc08a262505483a60375299fd82"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.573289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kjr8m" event={"ID":"89157053-d5d1-40f0-8b36-411d637d8385","Type":"ContainerStarted","Data":"8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.581711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" 
event={"ID":"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1","Type":"ContainerStarted","Data":"054178bb61aa05343cc1507c0e71ae24b7622d3d2152bc3ad1b43f3417deff65"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.596608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" event={"ID":"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4","Type":"ContainerStarted","Data":"5c79edec8b458850f4a2bd0a1cb5ee2cddabc5ecca93e8575c2f6a72fef8c321"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.602096 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" podStartSLOduration=20.602065145 podStartE2EDuration="20.602065145s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:16.592701316 +0000 UTC m=+44.604228101" watchObservedRunningTime="2026-01-28 18:36:16.602065145 +0000 UTC m=+44.613591920" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.608225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.611845 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.111777644 +0000 UTC m=+45.123304419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.613541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6xtcc" event={"ID":"67719fcc-4750-4f5e-a48f-e51cd8580903","Type":"ContainerStarted","Data":"ab47e1e3562da1d8ac37c386d4d711489936ed7170b262000da8aceded1e6c0c"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.643712 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" podStartSLOduration=19.643675108 podStartE2EDuration="19.643675108s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:16.620072028 +0000 UTC m=+44.631598823" watchObservedRunningTime="2026-01-28 18:36:16.643675108 +0000 UTC m=+44.655201883" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.645078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" event={"ID":"13b71160-3cba-4480-9180-22461548e389","Type":"ContainerStarted","Data":"605ec763c4dcb8071def3af3ef4fa05ef32d4f366e1e60c615d2722eea78a1d7"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.657537 4749 generic.go:334] "Generic (PLEG): container finished" podID="f1f9743c-bcb4-47c0-8c0d-23530ecca520" containerID="b9a18d865e52b5d9810a1259696e30e926ff879466489dd8068712ba3869f709" exitCode=0 Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.657633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" event={"ID":"f1f9743c-bcb4-47c0-8c0d-23530ecca520","Type":"ContainerDied","Data":"b9a18d865e52b5d9810a1259696e30e926ff879466489dd8068712ba3869f709"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.661140 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" event={"ID":"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e","Type":"ContainerStarted","Data":"c1f0f7551a25556324220ec1025cbfe2e98b52191b4dd409348412cccc6d7163"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.662594 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.675660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" event={"ID":"27aff929-50f9-48fe-b978-cbc83c1d6c66","Type":"ContainerStarted","Data":"fab3fc00a5a49330638be59fb079fbae835141007946b83f597c9bf93c4ed0f2"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.683841 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8mq6z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection 
refused" start-of-body= Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.683917 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" podUID="e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.685466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" event={"ID":"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f","Type":"ContainerStarted","Data":"4a0cae885d28a25a5cfbacec4dd2998fcf0e2ff897602cb19d3e4fdd9036af0d"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.687133 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.693824 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.706730 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" event={"ID":"7a9afb6a-713a-476f-9be4-84eabb0905de","Type":"ContainerStarted","Data":"47401607f220ee3d899c2679eba47b97f09c60ec60a185d6c05d22417a878838"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.711467 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.719999 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.219980072 +0000 UTC m=+45.231506837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.721395 4749 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g8k42 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.721492 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" podUID="1b4b2ca8-1515-43f9-81c5-613cf8c05f9f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.722472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" event={"ID":"2e43ff22-2928-48b1-b985-27926bcd5ef8","Type":"ContainerStarted","Data":"75bd6c310ca50848e6c83d4658bf104defdf1b944aba2d5a19e8af42bba3ba33"} Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.724424 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pqtw5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.724479 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.727929 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8p6sp"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.732425 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-gtqfj" podStartSLOduration=20.732389636 podStartE2EDuration="20.732389636s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:16.722006622 +0000 UTC m=+44.733533407" watchObservedRunningTime="2026-01-28 18:36:16.732389636 +0000 UTC m=+44.743916431" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.738758 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vddcm"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.761267 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5w6qf" 
podStartSLOduration=19.761245065 podStartE2EDuration="19.761245065s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:16.75978375 +0000 UTC m=+44.771310545" watchObservedRunningTime="2026-01-28 18:36:16.761245065 +0000 UTC m=+44.772771840" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.787532 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.787646 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" podStartSLOduration=19.787621724 podStartE2EDuration="19.787621724s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:16.778955061 +0000 UTC m=+44.790481856" watchObservedRunningTime="2026-01-28 18:36:16.787621724 +0000 UTC m=+44.799148509" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.808289 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z74vl"] Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.821395 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-47c58" podStartSLOduration=20.821367752 podStartE2EDuration="20.821367752s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:16.821310051 +0000 UTC m=+44.832836846" watchObservedRunningTime="2026-01-28 18:36:16.821367752 +0000 UTC m=+44.832894527" Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.822663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.823918 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.323896285 +0000 UTC m=+45.335423060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.826481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.832320 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.332298291 +0000 UTC m=+45.343825276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:16 crc kubenswrapper[4749]: W0128 18:36:16.833557 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9468344_0b6e_436d_a92a_0d53ee9bb179.slice/crio-1baa78e10722256d90d9b21f17f40977304f5741c248ad41599827832bbc85f8 WatchSource:0}: Error finding container 1baa78e10722256d90d9b21f17f40977304f5741c248ad41599827832bbc85f8: Status 404 returned error can't find the container with id 1baa78e10722256d90d9b21f17f40977304f5741c248ad41599827832bbc85f8 Jan 28 18:36:16 crc kubenswrapper[4749]: I0128 18:36:16.940211 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:16 crc kubenswrapper[4749]: E0128 18:36:16.940617 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.440601151 +0000 UTC m=+45.452127926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.046153 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.046592 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mg8d" podStartSLOduration=20.046575614 podStartE2EDuration="20.046575614s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:16.931129628 +0000 UTC m=+44.942656413" watchObservedRunningTime="2026-01-28 18:36:17.046575614 +0000 UTC m=+45.058102389" Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.046878 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.546865491 +0000 UTC m=+45.558392256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.130919 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" podStartSLOduration=20.130901365 podStartE2EDuration="20.130901365s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.129412808 +0000 UTC m=+45.140939593" watchObservedRunningTime="2026-01-28 18:36:17.130901365 +0000 UTC m=+45.142428140" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.154294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.154866 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.654842683 +0000 UTC m=+45.666369458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.262952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.263606 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.763577694 +0000 UTC m=+45.775104469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.362484 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-kjr8m" podStartSLOduration=20.362467042 podStartE2EDuration="20.362467042s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.360379791 +0000 UTC m=+45.371906586" watchObservedRunningTime="2026-01-28 18:36:17.362467042 +0000 UTC m=+45.373993837" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.364014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.364322 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.864289837 +0000 UTC m=+45.875816612 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.455641 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" podStartSLOduration=20.45561744 podStartE2EDuration="20.45561744s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.453101458 +0000 UTC m=+45.464628243" watchObservedRunningTime="2026-01-28 18:36:17.45561744 +0000 UTC m=+45.467144215" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.473447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.473791 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:17.973778716 +0000 UTC m=+45.985305491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.579313 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" podStartSLOduration=20.579284447 podStartE2EDuration="20.579284447s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.577916254 +0000 UTC m=+45.589443029" watchObservedRunningTime="2026-01-28 18:36:17.579284447 +0000 UTC m=+45.590811222" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.580500 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.581027 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.081003189 +0000 UTC m=+46.092529964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.682838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.683167 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.183155188 +0000 UTC m=+46.194681963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.723966 4749 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r9dfj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.724306 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" podUID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.752801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" event={"ID":"e795e7a2-8d9a-4ed3-8b70-bd1f003035a1","Type":"ContainerStarted","Data":"043832216c4c10eda560fde03537142592aade31a0b60add8b1b16326a8d3ced"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.784041 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.785864 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.285825061 +0000 UTC m=+46.297351846 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.786919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" event={"ID":"2b018047-b659-49f1-a494-aaf29e2925e3","Type":"ContainerStarted","Data":"d8aca8147ef4c290231677590ba0026f2ec718ce5a59b716814f7bbb578df6a0"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.804358 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5cct" podStartSLOduration=20.804316735 podStartE2EDuration="20.804316735s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.794518434 +0000 UTC m=+45.806045219" watchObservedRunningTime="2026-01-28 18:36:17.804316735 +0000 UTC m=+45.815843510" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.818636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vddcm" event={"ID":"d9468344-0b6e-436d-a92a-0d53ee9bb179","Type":"ContainerStarted","Data":"1baa78e10722256d90d9b21f17f40977304f5741c248ad41599827832bbc85f8"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.824221 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" podStartSLOduration=20.824198943 podStartE2EDuration="20.824198943s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.819109709 +0000 UTC m=+45.830636494" watchObservedRunningTime="2026-01-28 18:36:17.824198943 +0000 UTC m=+45.835725718" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.833848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" event={"ID":"6de83910-49eb-4462-850c-d2e7e84afb96","Type":"ContainerStarted","Data":"9648d97e61ab263c930bdd48f0f121b01990917de13b3e2e29bf5e45e6b74eb4"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.854854 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" event={"ID":"27aff929-50f9-48fe-b978-cbc83c1d6c66","Type":"ContainerStarted","Data":"6612fa8ec814a5e80193d4ba20f7482555a6526b731349e9bd5e0336d5670234"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.854915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" event={"ID":"27aff929-50f9-48fe-b978-cbc83c1d6c66","Type":"ContainerStarted","Data":"b48bb887561ab2db823bb9b7a7c5f243f15c6b8e4e5355c3810eef424cfaffbf"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.881971 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jj2hf" podStartSLOduration=20.881949501 podStartE2EDuration="20.881949501s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.881924451 +0000 UTC m=+45.893451226" watchObservedRunningTime="2026-01-28 18:36:17.881949501 +0000 UTC m=+45.893476276" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.892896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.896141 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.39611386 +0000 UTC m=+46.407640635 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.916639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" event={"ID":"00f6aad6-5565-4357-b6fb-bcf5ebd8dcd4","Type":"ContainerStarted","Data":"550e93f73fb3c2be2a8348f05dfc609122a5a51a5b26631ef1e6d280f7f03c7b"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.920592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" event={"ID":"d4d0f11e-2bf3-4b8e-9c42-aa763deade31","Type":"ContainerStarted","Data":"0b75c81dac79cf5166969291d10aec51de361354d25a73c4168fad963bc958d9"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.921466 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.933088 4749 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xtw8d container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.933319 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" podUID="d4d0f11e-2bf3-4b8e-9c42-aa763deade31" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.944584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" event={"ID":"38b4f55d-f2cd-4d07-93e1-f3c0cf4eca24","Type":"ContainerStarted","Data":"1a54709eb43e1ee8d335700d2ef21ba9a36cb29c245ed85f4a661d61cce1f926"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.953562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gz9f7" event={"ID":"24507e5f-704d-423c-a5dc-19d24cfb0014","Type":"ContainerStarted","Data":"ed70ebfc344e1754dd6a15d469fcb65ea2080eaf80455d979c25ab344d75eeb6"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.969903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" event={"ID":"85b1cb68-2bc1-4ac2-95ae-b91975c2fd8b","Type":"ContainerStarted","Data":"d574ae95f6449bc74a15a76070819e05ae584262e07d57470773c18e873eb331"} Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.974429 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42phc" podStartSLOduration=20.974404742 podStartE2EDuration="20.974404742s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.938097261 +0000 UTC m=+45.949624046" watchObservedRunningTime="2026-01-28 18:36:17.974404742 +0000 UTC m=+45.985931517" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.975870 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" podStartSLOduration=20.975862618 podStartE2EDuration="20.975862618s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:17.97432038 +0000 UTC m=+45.985847175" watchObservedRunningTime="2026-01-28 18:36:17.975862618 +0000 UTC m=+45.987389393" Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.996118 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.996268 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.496241619 +0000 UTC m=+46.507768394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:17 crc kubenswrapper[4749]: I0128 18:36:17.996462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:17 crc kubenswrapper[4749]: E0128 18:36:17.997946 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.49792769 +0000 UTC m=+46.509454685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.037552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" event={"ID":"13b71160-3cba-4480-9180-22461548e389","Type":"ContainerStarted","Data":"92fdb4d354c25b35f6760f175b3d02a0ed35fff173a5907a71d2061a24f21160"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.040887 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g7gbx" podStartSLOduration=22.040868135 podStartE2EDuration="22.040868135s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.038550618 +0000 UTC m=+46.050077413" watchObservedRunningTime="2026-01-28 18:36:18.040868135 +0000 UTC m=+46.052394910" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.047596 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" event={"ID":"9d16abc4-7306-4cf8-b173-ae9e007f4519","Type":"ContainerStarted","Data":"2e318a6469c45712905b5d0608779c22b7408c87c0834641c187a444e4561797"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.059666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" event={"ID":"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2","Type":"ContainerStarted","Data":"d76e75d084a05816e4a7c4a0f9964b2cd2b1790b4f1f9ce67af27653967a9e52"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.093847 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8p6sp" 
event={"ID":"802ae694-4a2b-4bb4-b826-179b981b28c2","Type":"ContainerStarted","Data":"6c8fb735293cdbb97cc5b2e4d430c4cc552b91008a2ddbe1dba95ccb7ee2a284"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.093904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8p6sp" event={"ID":"802ae694-4a2b-4bb4-b826-179b981b28c2","Type":"ContainerStarted","Data":"7729d15533c655f3d3aa585c589fa647c285bedf0b9271e24159105eeea89bb1"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.100013 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.101393 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.601374531 +0000 UTC m=+46.612901316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.116442 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2nf7" podStartSLOduration=21.116420561 podStartE2EDuration="21.116420561s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.077495474 +0000 UTC m=+46.089022259" watchObservedRunningTime="2026-01-28 18:36:18.116420561 +0000 UTC m=+46.127947336" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.136730 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" event={"ID":"7a9afb6a-713a-476f-9be4-84eabb0905de","Type":"ContainerStarted","Data":"cfca4f766d46e17f7d493899f62301bb1e7cbf16f02aea62109d8cb0857ae319"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.171819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" event={"ID":"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b","Type":"ContainerStarted","Data":"84d98d514ff26afd1d8049527e7421368763fac7b28e32a642f1a74aab13e568"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.188368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6xtcc" event={"ID":"67719fcc-4750-4f5e-a48f-e51cd8580903","Type":"ContainerStarted","Data":"02bb1b530198f83f0f2b558c5914ef90256e291288d0ea39fa3fe35ecb27e504"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.189847 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-k59mg" 
podStartSLOduration=21.189836314 podStartE2EDuration="21.189836314s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.123217698 +0000 UTC m=+46.134744463" watchObservedRunningTime="2026-01-28 18:36:18.189836314 +0000 UTC m=+46.201363089" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.203224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.204780 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.70476477 +0000 UTC m=+46.716291545 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.216232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8tmng" event={"ID":"384d6d65-9777-47a4-bef0-dbeeb9959e66","Type":"ContainerStarted","Data":"97d3a1b0b0a6f64466783daecd9a3fd858e509354735dcc37f2df8c44538601d"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.220112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.234970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" event={"ID":"af109a22-6845-4bb1-a272-ccfe19c51573","Type":"ContainerStarted","Data":"2180a48d31cede983f5b98ce543009eab111609c16382789d13076aba85d98d3"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.247902 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.247974 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.249276 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wx864" podStartSLOduration=21.249259374 podStartE2EDuration="21.249259374s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.184930454 +0000 UTC m=+46.196457229" watchObservedRunningTime="2026-01-28 18:36:18.249259374 +0000 UTC m=+46.260786149" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.250780 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6xtcc" podStartSLOduration=7.250773881 podStartE2EDuration="7.250773881s" podCreationTimestamp="2026-01-28 18:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.248263149 +0000 UTC m=+46.259789944" watchObservedRunningTime="2026-01-28 18:36:18.250773881 +0000 UTC m=+46.262300656" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.284220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" event={"ID":"c85cc396-a32d-4118-8d79-fc50352372ba","Type":"ContainerStarted","Data":"0641742cc9dcdf7b86f9dbffcbc30b6025516092067c1e9bfa7b90b5e42b5667"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.307009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.307792 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.807777461 +0000 UTC m=+46.819304236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.316388 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8tmng" podStartSLOduration=21.316364212 podStartE2EDuration="21.316364212s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.309745639 +0000 UTC m=+46.321272434" watchObservedRunningTime="2026-01-28 18:36:18.316364212 +0000 UTC m=+46.327890987" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.336614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" event={"ID":"1b4b2ca8-1515-43f9-81c5-613cf8c05f9f","Type":"ContainerStarted","Data":"44b87d796049202ba9d7ba6e1672e2940d5b780c5d5cf742f154fdfda8f02f70"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.337963 4749 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g8k42 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.337985 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" podUID="1b4b2ca8-1515-43f9-81c5-613cf8c05f9f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.373875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" event={"ID":"0400ab50-bd41-421e-a093-73dd02d7bf9a","Type":"ContainerStarted","Data":"3920c62286f3ae8c88ea592f507a4bd2160e4e086e60712814c95ffd87e0c713"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.373914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" event={"ID":"0400ab50-bd41-421e-a093-73dd02d7bf9a","Type":"ContainerStarted","Data":"33dbdd9416f61770a26e42978640abb7ff5813683caaadaecc6347d0d9783441"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.395095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" event={"ID":"90837a80-df80-457d-a061-ad3933620e19","Type":"ContainerStarted","Data":"0764ab9cea940b89b2aef30937c7d972733d1f8ad14d0957ffbe72d48fbac105"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.408500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.409749 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:18.909728134 +0000 UTC m=+46.921254989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.423524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nr5b7" event={"ID":"2e43ff22-2928-48b1-b985-27926bcd5ef8","Type":"ContainerStarted","Data":"311237fd63b3f5a5ce1c2f92c84415af251d804782dca152969c77d55c50cd89"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.430692 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" podStartSLOduration=21.430672949 podStartE2EDuration="21.430672949s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.429281635 +0000 UTC m=+46.440808430" watchObservedRunningTime="2026-01-28 18:36:18.430672949 +0000 UTC m=+46.442199724" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.441825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" event={"ID":"9af4e08d-4ce1-404c-b962-bb53ed89552c","Type":"ContainerStarted","Data":"5e1507f3c14499d6a8eeceeabe8eca1b218f7ac86bb2a16c353c2f865b385dc1"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.442168 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.446360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" event={"ID":"d0092231-157d-498a-a811-c533dccee8ce","Type":"ContainerStarted","Data":"f8ec94190709a5971dc7c17fbda6d64e08a4b6dd2d0828032572f49f24a52073"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.447365 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.451352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" event={"ID":"43900a49-0d8f-48a8-b6af-385321464445","Type":"ContainerStarted","Data":"c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.452094 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 
18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.455606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" event={"ID":"c23f9f6a-3b73-4429-8f26-a8c5e79963c9","Type":"ContainerStarted","Data":"a3f114195007f96fb967a5b3420b3f7e966a561124d62e93811de6450903e6ba"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.455640 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.455846 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z74vl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.456065 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.469519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" event={"ID":"ef9d771b-78e9-4131-afe5-f1f90025783e","Type":"ContainerStarted","Data":"1e90a10e49c1e8306322cbc602c6316ce5be2bd5da9c65068073c769d0d0eebc"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.473692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pdnfq" event={"ID":"9ed379e4-88c7-479e-9005-c7980ba50ccd","Type":"ContainerStarted","Data":"8c55615cfc132e6174686a05ba0c5c8dc1016a8c82d2108de4d3e79f41824b5a"} Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.512416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.513378 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" podStartSLOduration=21.513317949 podStartE2EDuration="21.513317949s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.510619982 +0000 UTC m=+46.522146777" watchObservedRunningTime="2026-01-28 18:36:18.513317949 +0000 UTC m=+46.524844714" Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.514455 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.014435746 +0000 UTC m=+47.025962521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.532522 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.592690 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" podStartSLOduration=21.592670748 podStartE2EDuration="21.592670748s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.585446741 +0000 UTC m=+46.596973536" watchObservedRunningTime="2026-01-28 18:36:18.592670748 +0000 UTC m=+46.604197523" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.619108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.622980 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.639782 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.139749935 +0000 UTC m=+47.151276710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.661663 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.680722 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" podStartSLOduration=21.68070573 podStartE2EDuration="21.68070573s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.679809638 +0000 UTC m=+46.691336423" watchObservedRunningTime="2026-01-28 18:36:18.68070573 +0000 UTC m=+46.692232505" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.692648 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:18 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:18 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:18 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.692727 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.716505 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wtt5v" podStartSLOduration=22.716481609 podStartE2EDuration="22.716481609s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.714962441 +0000 UTC m=+46.726489226" watchObservedRunningTime="2026-01-28 18:36:18.716481609 +0000 UTC m=+46.728008374" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.722837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.723119 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.223101321 +0000 UTC m=+47.234628096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.723209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.723561 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.223555152 +0000 UTC m=+47.235081927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.765590 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pdnfq" podStartSLOduration=21.765575765 podStartE2EDuration="21.765575765s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.76377769 +0000 UTC m=+46.775304485" watchObservedRunningTime="2026-01-28 18:36:18.765575765 +0000 UTC m=+46.777102540" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.823869 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.824573 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.324555784 +0000 UTC m=+47.336082559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.908091 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" podStartSLOduration=7.908060724 podStartE2EDuration="7.908060724s" podCreationTimestamp="2026-01-28 18:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:18.804799009 +0000 UTC m=+46.816325784" watchObservedRunningTime="2026-01-28 18:36:18.908060724 +0000 UTC m=+46.919587499" Jan 28 18:36:18 crc kubenswrapper[4749]: I0128 18:36:18.926261 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:18 crc kubenswrapper[4749]: E0128 18:36:18.926675 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.426663112 +0000 UTC m=+47.438189887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.028974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.029384 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.529367424 +0000 UTC m=+47.540894199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.130583 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.131174 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.631158734 +0000 UTC m=+47.642685509 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.232101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.232274 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.732242767 +0000 UTC m=+47.743769542 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.232382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.232789 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.73277894 +0000 UTC m=+47.744305785 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.240022 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.333613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.333788 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.83376248 +0000 UTC m=+47.845289255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.334088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.334408 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.834398366 +0000 UTC m=+47.845925131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.435618 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.435884 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.935852998 +0000 UTC m=+47.947379773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.436028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.436532 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:19.936509754 +0000 UTC m=+47.948036579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.489634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" event={"ID":"d0092231-157d-498a-a811-c533dccee8ce","Type":"ContainerStarted","Data":"bb5be9913a3a37f304a7e73e831a5e4c4930b47ee3d0f71da8c7e5d162241abe"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.490588 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z74vl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.490693 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.495104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" event={"ID":"c85cc396-a32d-4118-8d79-fc50352372ba","Type":"ContainerStarted","Data":"e542cf71e1ed371467979684a47db7aa1b738c09bd8a0e5c02ed39c5a38b079a"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.500814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" event={"ID":"611afe15-fc0f-4d7f-a734-5ea2e4e0e591","Type":"ContainerStarted","Data":"d8a80755606f3e503245b0e6fb6e4d1ed71026f36bc94b540b2c6936e3621a2d"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.505804 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" event={"ID":"0400ab50-bd41-421e-a093-73dd02d7bf9a","Type":"ContainerStarted","Data":"28e6a05a6205f5ddbda0f4aed9aedf18c4a4dd84061eaca342013331a5e22523"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.508010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vddcm" event={"ID":"d9468344-0b6e-436d-a92a-0d53ee9bb179","Type":"ContainerStarted","Data":"28b8976807b7e43b4d8f1becfd76e979c0fac868950688294a19a85476dbe217"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.508202 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.509703 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-vddcm container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/readyz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.509756 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vddcm" podUID="d9468344-0b6e-436d-a92a-0d53ee9bb179" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/readyz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.511528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" event={"ID":"b69c1f3e-a87d-4d0c-8d31-2286c4ec30e2","Type":"ContainerStarted","Data":"1d50080275ef38a27138d10d09fd1d4cd0d06bd604ea24aa0c9d1645e6e1d17b"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.517006 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" event={"ID":"f1f9743c-bcb4-47c0-8c0d-23530ecca520","Type":"ContainerStarted","Data":"2567f2442271d00daa84e37f539e2ebddd1f0c54dfc735c09b6487811f43b697"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.517059 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" event={"ID":"f1f9743c-bcb4-47c0-8c0d-23530ecca520","Type":"ContainerStarted","Data":"dcd6a589d6c6232726de2f69c8672f09399e5f7d03ef17cebbd6cd00f94c3c76"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.521979 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwwzh" podStartSLOduration=22.521955443 podStartE2EDuration="22.521955443s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.518561039 +0000 UTC m=+47.530087814" watchObservedRunningTime="2026-01-28 18:36:19.521955443 +0000 UTC m=+47.533482228" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.522788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" event={"ID":"6de83910-49eb-4462-850c-d2e7e84afb96","Type":"ContainerStarted","Data":"954c757bc93bdf8e6f6c96db07e15675b06a5b63c979e178c06916ba1d61d475"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 
18:36:19.535649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gz9f7" event={"ID":"24507e5f-704d-423c-a5dc-19d24cfb0014","Type":"ContainerStarted","Data":"44dffe1521243e3438a897c8ae89e9edea46bacc4e39e14a8e35bdc7a23f2cb0"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.536620 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.537098 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.537352 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.0372992 +0000 UTC m=+48.048825985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.537680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.539156 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.039135105 +0000 UTC m=+48.050662120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.542304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" event={"ID":"9d16abc4-7306-4cf8-b173-ae9e007f4519","Type":"ContainerStarted","Data":"3e59f48c3ec2ec55233252e48336718f02bdeee1dcbdcdeed519928ae2872e44"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.542379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" event={"ID":"9d16abc4-7306-4cf8-b173-ae9e007f4519","Type":"ContainerStarted","Data":"42ad15e06cf35db854d1c56775fb159433db444fbce10ec5defc07dc71e59b3b"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.552441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n56c5" event={"ID":"90837a80-df80-457d-a061-ad3933620e19","Type":"ContainerStarted","Data":"ec0e03d44092ca180038406477e84c76f15f81d7356691c6c77b71110cd68b41"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.561343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" event={"ID":"c23f9f6a-3b73-4429-8f26-a8c5e79963c9","Type":"ContainerStarted","Data":"53cece0a0864f59ab673242bfbe8123526fd2c34bf83cd27a11301d3a3dfc3c4"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.562137 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-tms2k"] Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.565472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" event={"ID":"e3c3a6c4-f7e1-4b36-9b49-e9264fb44d5b","Type":"ContainerStarted","Data":"61b0e8eb993326731c668280b0431cc788000030efccece5732bd1da2d9cc31f"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.572822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" event={"ID":"af109a22-6845-4bb1-a272-ccfe19c51573","Type":"ContainerStarted","Data":"fca18fc31c1528b4f38164e916c753322415b81fe8b356a189dcda52b8e8a148"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.572862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" event={"ID":"af109a22-6845-4bb1-a272-ccfe19c51573","Type":"ContainerStarted","Data":"5f76b025a1d4d8a1a7eb08acf9dfeb5924020b0fac1ef46437fa203b51ceb7cb"} Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.583676 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.583742 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" 
podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.604919 4749 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g8k42 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.605176 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" podUID="1b4b2ca8-1515-43f9-81c5-613cf8c05f9f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.609625 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtw8d" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.632637 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" podStartSLOduration=22.632617511 podStartE2EDuration="22.632617511s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.630261613 +0000 UTC m=+47.641788398" watchObservedRunningTime="2026-01-28 18:36:19.632617511 +0000 UTC m=+47.644144286" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.633569 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-srdjk" podStartSLOduration=22.633562193 podStartE2EDuration="22.633562193s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.5821081 +0000 UTC m=+47.593634885" watchObservedRunningTime="2026-01-28 18:36:19.633562193 +0000 UTC m=+47.645088968" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.641242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.644203 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.144179745 +0000 UTC m=+48.155706520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.655229 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vddcm" podStartSLOduration=22.655209936 podStartE2EDuration="22.655209936s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.653416792 +0000 UTC m=+47.664943587" watchObservedRunningTime="2026-01-28 18:36:19.655209936 +0000 UTC m=+47.666736711" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.687561 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:19 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:19 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:19 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.687629 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.697650 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6ctqb" podStartSLOduration=22.697628717 podStartE2EDuration="22.697628717s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.697509964 +0000 UTC m=+47.709036749" watchObservedRunningTime="2026-01-28 18:36:19.697628717 +0000 UTC m=+47.709155492" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.727646 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-r2995" podStartSLOduration=22.727626695 podStartE2EDuration="22.727626695s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.726132748 +0000 UTC m=+47.737659523" watchObservedRunningTime="2026-01-28 18:36:19.727626695 +0000 UTC m=+47.739153480" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.743241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:19 crc 
kubenswrapper[4749]: E0128 18:36:19.743703 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.243691539 +0000 UTC m=+48.255218314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.791912 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8p6sp" podStartSLOduration=8.791895483 podStartE2EDuration="8.791895483s" podCreationTimestamp="2026-01-28 18:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.790894138 +0000 UTC m=+47.802420913" watchObservedRunningTime="2026-01-28 18:36:19.791895483 +0000 UTC m=+47.803422258" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.792098 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-tzdvz" podStartSLOduration=22.792093768 podStartE2EDuration="22.792093768s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.769167454 +0000 UTC m=+47.780694249" watchObservedRunningTime="2026-01-28 18:36:19.792093768 +0000 UTC m=+47.803620543" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.844407 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-6lqrz" podStartSLOduration=22.844390142 podStartE2EDuration="22.844390142s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.837893732 +0000 UTC m=+47.849420527" watchObservedRunningTime="2026-01-28 18:36:19.844390142 +0000 UTC m=+47.855916917" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.844947 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.845305 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.345288095 +0000 UTC m=+48.356814870 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.906150 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gz9f7" podStartSLOduration=8.906135449 podStartE2EDuration="8.906135449s" podCreationTimestamp="2026-01-28 18:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.896618085 +0000 UTC m=+47.908144860" watchObservedRunningTime="2026-01-28 18:36:19.906135449 +0000 UTC m=+47.917662224" Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.949085 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:19 crc kubenswrapper[4749]: E0128 18:36:19.949641 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.449623287 +0000 UTC m=+48.461150052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:19 crc kubenswrapper[4749]: I0128 18:36:19.988567 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r56pg" podStartSLOduration=22.988544083 podStartE2EDuration="22.988544083s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:19.986126143 +0000 UTC m=+47.997652928" watchObservedRunningTime="2026-01-28 18:36:19.988544083 +0000 UTC m=+48.000070858" Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.050744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.051203 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 18:36:20.551181501 +0000 UTC m=+48.562708276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.152813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.153583 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.653567276 +0000 UTC m=+48.665094051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.254465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.254959 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.754939746 +0000 UTC m=+48.766466521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.356172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.356633 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.856618523 +0000 UTC m=+48.868145298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.457415 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.457927 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:20.957898721 +0000 UTC m=+48.969425496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.559557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.560200 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.060178583 +0000 UTC m=+49.071705358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.579339 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" gracePeriod=30 Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.579918 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" event={"ID":"814398d6-5d05-4161-bd2e-3ff61d27f2c7","Type":"ContainerStarted","Data":"6bdcbece61e54b7bcc26d569843c8c370c77302d1f520e7c0919065eb5b99865"} Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.585691 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z74vl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.585744 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.588508 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:20 crc kubenswrapper[4749]: 
I0128 18:36:20.588568 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.588613 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-vddcm container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/readyz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.588690 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vddcm" podUID="d9468344-0b6e-436d-a92a-0d53ee9bb179" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/readyz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.633807 4749 csr.go:261] certificate signing request csr-p2d2f is approved, waiting to be issued Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.660919 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:20 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:20 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:20 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.660977 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.661446 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.661701 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.161669456 +0000 UTC m=+49.173196241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.663362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.665077 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.165053579 +0000 UTC m=+49.176580354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.701686 4749 csr.go:257] certificate signing request csr-p2d2f is issued Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.706512 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" podStartSLOduration=24.706484147 podStartE2EDuration="24.706484147s" podCreationTimestamp="2026-01-28 18:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:20.686871055 +0000 UTC m=+48.698397840" watchObservedRunningTime="2026-01-28 18:36:20.706484147 +0000 UTC m=+48.718010922" Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.767081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.767748 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.267468024 +0000 UTC m=+49.278994799 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.868531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.868874 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.368858465 +0000 UTC m=+49.380385240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.970290 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.970428 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.470406488 +0000 UTC m=+49.481933263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:20 crc kubenswrapper[4749]: I0128 18:36:20.970823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:20 crc kubenswrapper[4749]: E0128 18:36:20.971107 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.471092826 +0000 UTC m=+49.482619601 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.071580 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.071879 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.57183419 +0000 UTC m=+49.583360965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.071996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.072394 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.572375403 +0000 UTC m=+49.583902178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.173523 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.173703 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.673676511 +0000 UTC m=+49.685203276 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.174033 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.174315 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.674302907 +0000 UTC m=+49.685829682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.275372 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.275654 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.775634805 +0000 UTC m=+49.787161580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.376729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.377098 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.877086298 +0000 UTC m=+49.888613073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.478257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.478664 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:21.978644281 +0000 UTC m=+49.990171066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.579561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.579913 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.079896609 +0000 UTC m=+50.091423384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.585771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" event={"ID":"814398d6-5d05-4161-bd2e-3ff61d27f2c7","Type":"ContainerStarted","Data":"4c2737c27dfe897d7b5597f1bc4e39f8d657e5f2dbb30948b49f7966195a0305"} Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.587622 4749 generic.go:334] "Generic (PLEG): container finished" podID="2b018047-b659-49f1-a494-aaf29e2925e3" containerID="d8aca8147ef4c290231677590ba0026f2ec718ce5a59b716814f7bbb578df6a0" exitCode=0 Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.588162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" event={"ID":"2b018047-b659-49f1-a494-aaf29e2925e3","Type":"ContainerDied","Data":"d8aca8147ef4c290231677590ba0026f2ec718ce5a59b716814f7bbb578df6a0"} Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.671991 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:21 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:21 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:21 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.672046 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.681105 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.681750 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.181720759 +0000 UTC m=+50.193247534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.704629 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-28 18:31:20 +0000 UTC, rotation deadline is 2026-10-11 18:41:37.714972468 +0000 UTC Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.704663 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6144h5m16.010312335s for next certificate rotation Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.760344 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cn95r"] Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.761320 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.764968 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.783779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.784119 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.284103904 +0000 UTC m=+50.295630669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.793024 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cn95r"] Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.885992 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.886170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn59z\" (UniqueName: \"kubernetes.io/projected/01642193-d926-44c5-908a-98716476032b-kube-api-access-hn59z\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.886235 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-catalog-content\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.886267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-utilities\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.886407 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.386391646 +0000 UTC m=+50.397918421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.988463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-utilities\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.988563 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn59z\" (UniqueName: \"kubernetes.io/projected/01642193-d926-44c5-908a-98716476032b-kube-api-access-hn59z\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.988607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.988641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-catalog-content\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.989144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-catalog-content\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: I0128 18:36:21.989419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-utilities\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:21 crc kubenswrapper[4749]: E0128 18:36:21.990085 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.490070843 +0000 UTC m=+50.501597618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.075500 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn59z\" (UniqueName: \"kubernetes.io/projected/01642193-d926-44c5-908a-98716476032b-kube-api-access-hn59z\") pod \"certified-operators-cn95r\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.080172 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.090068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.090585 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.590558971 +0000 UTC m=+50.602085756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.188563 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4ngh4"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.190138 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.192423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.192913 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.692896294 +0000 UTC m=+50.704423069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.220032 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.221076 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.245293 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4ngh4"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.259040 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.260882 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.278414 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.296942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.297074 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqxqx\" (UniqueName: \"kubernetes.io/projected/b76a4267-3557-4246-b3dc-84a610d9fbd4-kube-api-access-jqxqx\") pod \"certified-operators-4ngh4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.297101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb263a5f-e760-4afa-9f5e-9138a17957d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb263a5f-e760-4afa-9f5e-9138a17957d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.297121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb263a5f-e760-4afa-9f5e-9138a17957d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb263a5f-e760-4afa-9f5e-9138a17957d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.297194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-catalog-content\") pod \"certified-operators-4ngh4\" (UID: 
\"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.297220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-utilities\") pod \"certified-operators-4ngh4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.297347 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.797311739 +0000 UTC m=+50.808838514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.403042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxqx\" (UniqueName: \"kubernetes.io/projected/b76a4267-3557-4246-b3dc-84a610d9fbd4-kube-api-access-jqxqx\") pod \"certified-operators-4ngh4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.407508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb263a5f-e760-4afa-9f5e-9138a17957d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb263a5f-e760-4afa-9f5e-9138a17957d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.407548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb263a5f-e760-4afa-9f5e-9138a17957d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb263a5f-e760-4afa-9f5e-9138a17957d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.407714 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.407770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-catalog-content\") pod \"certified-operators-4ngh4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.407828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-utilities\") pod \"certified-operators-4ngh4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.411687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb263a5f-e760-4afa-9f5e-9138a17957d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb263a5f-e760-4afa-9f5e-9138a17957d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.412011 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:22.911995746 +0000 UTC m=+50.923522521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.412569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-catalog-content\") pod \"certified-operators-4ngh4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.421308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-utilities\") pod \"certified-operators-4ngh4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.443015 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rhv8w"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.445077 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.453267 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxqx\" (UniqueName: \"kubernetes.io/projected/b76a4267-3557-4246-b3dc-84a610d9fbd4-kube-api-access-jqxqx\") pod \"certified-operators-4ngh4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.464577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb263a5f-e760-4afa-9f5e-9138a17957d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb263a5f-e760-4afa-9f5e-9138a17957d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.471051 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.484101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhv8w"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.521396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.526888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.527059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2dl2\" (UniqueName: \"kubernetes.io/projected/ffc6bd30-5803-4a01-a711-4f3b3c718750-kube-api-access-z2dl2\") pod \"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.527095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-utilities\") pod \"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.527175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-catalog-content\") pod \"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.527358 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:23.027300508 +0000 UTC m=+51.038827343 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.538397 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jbrpd"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.539762 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.558860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbrpd"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.563654 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.628954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-utilities\") pod \"community-operators-jbrpd\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.629047 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-catalog-content\") pod \"community-operators-jbrpd\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.629086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.629124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fdk\" (UniqueName: \"kubernetes.io/projected/c03b812d-6833-4c65-887b-0fa0a6c1227a-kube-api-access-n9fdk\") pod \"community-operators-jbrpd\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.629163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dl2\" (UniqueName: \"kubernetes.io/projected/ffc6bd30-5803-4a01-a711-4f3b3c718750-kube-api-access-z2dl2\") pod \"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.629192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-utilities\") pod 
\"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.629246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-catalog-content\") pod \"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.629774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-catalog-content\") pod \"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.630111 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:23.130093793 +0000 UTC m=+51.141620568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.635568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-utilities\") pod \"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.657141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2dl2\" (UniqueName: \"kubernetes.io/projected/ffc6bd30-5803-4a01-a711-4f3b3c718750-kube-api-access-z2dl2\") pod \"community-operators-rhv8w\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.658123 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:22 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:22 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:22 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.658172 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.675052 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cn95r"] Jan 28 18:36:22 crc 
kubenswrapper[4749]: I0128 18:36:22.730958 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.731319 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-catalog-content\") pod \"community-operators-jbrpd\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.731409 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:23.231368171 +0000 UTC m=+51.242894946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.731461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.731521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fdk\" (UniqueName: \"kubernetes.io/projected/c03b812d-6833-4c65-887b-0fa0a6c1227a-kube-api-access-n9fdk\") pod \"community-operators-jbrpd\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.731632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-utilities\") pod \"community-operators-jbrpd\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.731851 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-catalog-content\") pod \"community-operators-jbrpd\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.731990 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-utilities\") pod \"community-operators-jbrpd\" (UID: 
\"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.734832 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:23.234791244 +0000 UTC m=+51.246318019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.751823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fdk\" (UniqueName: \"kubernetes.io/projected/c03b812d-6833-4c65-887b-0fa0a6c1227a-kube-api-access-n9fdk\") pod \"community-operators-jbrpd\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.755189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" event={"ID":"814398d6-5d05-4161-bd2e-3ff61d27f2c7","Type":"ContainerStarted","Data":"5121c5e8d8bcaebba9f18f3a2a30aea676e4661f138126d5ce2350f381dac684"} Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.811248 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4ngh4"] Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.829982 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.831884 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.832126 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 18:36:23.332109315 +0000 UTC m=+51.343636090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.848898 4749 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.868703 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 28 18:36:22 crc kubenswrapper[4749]: W0128 18:36:22.887922 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbb263a5f_e760_4afa_9f5e_9138a17957d5.slice/crio-9cd43352162bdb5ab45a36ef1b7fed7655570651bc6188e80ad12d321512ed12 WatchSource:0}: Error finding container 9cd43352162bdb5ab45a36ef1b7fed7655570651bc6188e80ad12d321512ed12: Status 404 returned error can't find the container with id 9cd43352162bdb5ab45a36ef1b7fed7655570651bc6188e80ad12d321512ed12 Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.890164 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kx92t" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.917708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.933592 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:22 crc kubenswrapper[4749]: E0128 18:36:22.933878 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 18:36:23.433851414 +0000 UTC m=+51.445378179 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5hd" (UID: "a45db790-79fe-4928-b701-d64737024f60") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.988093 4749 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-28T18:36:22.848920678Z","Handler":null,"Name":""} Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.992387 4749 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 28 18:36:22 crc kubenswrapper[4749]: I0128 18:36:22.992432 4749 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.034904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.084779 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhv8w"] Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.282215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.331221 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.339164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.427424 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.427871 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.440553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b018047-b659-49f1-a494-aaf29e2925e3-secret-volume\") pod \"2b018047-b659-49f1-a494-aaf29e2925e3\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.440644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b018047-b659-49f1-a494-aaf29e2925e3-config-volume\") pod \"2b018047-b659-49f1-a494-aaf29e2925e3\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.440720 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5954\" (UniqueName: \"kubernetes.io/projected/2b018047-b659-49f1-a494-aaf29e2925e3-kube-api-access-r5954\") pod \"2b018047-b659-49f1-a494-aaf29e2925e3\" (UID: \"2b018047-b659-49f1-a494-aaf29e2925e3\") " Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.441885 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b018047-b659-49f1-a494-aaf29e2925e3-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b018047-b659-49f1-a494-aaf29e2925e3" (UID: "2b018047-b659-49f1-a494-aaf29e2925e3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.449938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b018047-b659-49f1-a494-aaf29e2925e3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b018047-b659-49f1-a494-aaf29e2925e3" (UID: "2b018047-b659-49f1-a494-aaf29e2925e3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.450095 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b018047-b659-49f1-a494-aaf29e2925e3-kube-api-access-r5954" (OuterVolumeSpecName: "kube-api-access-r5954") pod "2b018047-b659-49f1-a494-aaf29e2925e3" (UID: "2b018047-b659-49f1-a494-aaf29e2925e3"). InnerVolumeSpecName "kube-api-access-r5954". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.469181 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5hd\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.544106 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b018047-b659-49f1-a494-aaf29e2925e3-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.544758 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b018047-b659-49f1-a494-aaf29e2925e3-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.544770 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5954\" (UniqueName: \"kubernetes.io/projected/2b018047-b659-49f1-a494-aaf29e2925e3-kube-api-access-r5954\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.569109 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.612149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbrpd"] Jan 28 18:36:23 crc kubenswrapper[4749]: W0128 18:36:23.619272 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc03b812d_6833_4c65_887b_0fa0a6c1227a.slice/crio-c2676b1bdc8e15da808e143afad06ea893a551e5b5eb8ba2bc4accbd785238d0 WatchSource:0}: Error finding container c2676b1bdc8e15da808e143afad06ea893a551e5b5eb8ba2bc4accbd785238d0: Status 404 returned error can't find the container with id c2676b1bdc8e15da808e143afad06ea893a551e5b5eb8ba2bc4accbd785238d0 Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.666985 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:23 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:23 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:23 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.667074 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.776632 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ngh4" event={"ID":"b76a4267-3557-4246-b3dc-84a610d9fbd4","Type":"ContainerDied","Data":"975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.776595 4749 generic.go:334] "Generic (PLEG): container finished" podID="b76a4267-3557-4246-b3dc-84a610d9fbd4" 
containerID="975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4" exitCode=0 Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.777025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ngh4" event={"ID":"b76a4267-3557-4246-b3dc-84a610d9fbd4","Type":"ContainerStarted","Data":"c400e76b96fe5d4ae74e4b919c2f2055142c6506f9ffb1475edb2af085e0b5b8"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.778820 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.778825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj" event={"ID":"2b018047-b659-49f1-a494-aaf29e2925e3","Type":"ContainerDied","Data":"9fd350c98028a35487d88ffa0cbb702595f42a89a2d127287c1a37197d3b0592"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.778860 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd350c98028a35487d88ffa0cbb702595f42a89a2d127287c1a37197d3b0592" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.779793 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.780693 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bb263a5f-e760-4afa-9f5e-9138a17957d5","Type":"ContainerStarted","Data":"9cd43352162bdb5ab45a36ef1b7fed7655570651bc6188e80ad12d321512ed12"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.784294 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerID="9ecb988b34f11480980f3689d27f4f5957c46410631b7330361ef12093692f7a" exitCode=0 Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.784444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhv8w" event={"ID":"ffc6bd30-5803-4a01-a711-4f3b3c718750","Type":"ContainerDied","Data":"9ecb988b34f11480980f3689d27f4f5957c46410631b7330361ef12093692f7a"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.784476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhv8w" event={"ID":"ffc6bd30-5803-4a01-a711-4f3b3c718750","Type":"ContainerStarted","Data":"6190ed0d15e035b8f99b7d42404e79e9db8be2be771d0fb977123110d969174b"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.789032 4749 generic.go:334] "Generic (PLEG): container finished" podID="01642193-d926-44c5-908a-98716476032b" containerID="f24936f5a1c0c6d4f9193af6249231de12f825866b02fde2b77557e4a585f37b" exitCode=0 Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.789123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn95r" event={"ID":"01642193-d926-44c5-908a-98716476032b","Type":"ContainerDied","Data":"f24936f5a1c0c6d4f9193af6249231de12f825866b02fde2b77557e4a585f37b"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.789156 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn95r" event={"ID":"01642193-d926-44c5-908a-98716476032b","Type":"ContainerStarted","Data":"473270e3aa4e346f2ad4b6fd84b7d97b5864bba706628d79cf5e273f0ec0ac7b"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.794487 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbrpd" event={"ID":"c03b812d-6833-4c65-887b-0fa0a6c1227a","Type":"ContainerStarted","Data":"c2676b1bdc8e15da808e143afad06ea893a551e5b5eb8ba2bc4accbd785238d0"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.809786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" event={"ID":"814398d6-5d05-4161-bd2e-3ff61d27f2c7","Type":"ContainerStarted","Data":"768acf1942675b613ba7413d0e5e3e2b9f7c36924577d4d59fe21108db5bfcef"} Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.810875 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5hd"] Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.864045 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xx5t5" podStartSLOduration=12.86402691 podStartE2EDuration="12.86402691s" podCreationTimestamp="2026-01-28 18:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:23.862157005 +0000 UTC m=+51.873683790" watchObservedRunningTime="2026-01-28 18:36:23.86402691 +0000 UTC m=+51.875553685" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.870725 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.896046 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.928585 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.929577 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.946765 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.952442 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:23 crc kubenswrapper[4749]: I0128 18:36:23.961346 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.133949 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-whzp4"] Jan 28 18:36:24 crc kubenswrapper[4749]: E0128 18:36:24.134164 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b018047-b659-49f1-a494-aaf29e2925e3" containerName="collect-profiles" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.134177 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b018047-b659-49f1-a494-aaf29e2925e3" containerName="collect-profiles" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.134276 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b018047-b659-49f1-a494-aaf29e2925e3" containerName="collect-profiles" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.135000 4749 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.137841 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.153640 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whzp4"] Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.154835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-utilities\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.154905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsqlx\" (UniqueName: \"kubernetes.io/projected/5239a0f8-12de-4979-af6c-d209a21bc067-kube-api-access-vsqlx\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.154940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-catalog-content\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.213108 4749 patch_prober.go:28] interesting pod/apiserver-76f77b778f-dhdg9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]log ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]etcd ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/generic-apiserver-start-informers ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/max-in-flight-filter ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 28 18:36:24 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 28 18:36:24 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectcache ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-startinformers ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 28 18:36:24 crc kubenswrapper[4749]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 28 18:36:24 crc kubenswrapper[4749]: livez check failed Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.213186 4749 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" podUID="f1f9743c-bcb4-47c0-8c0d-23530ecca520" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.256479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-utilities\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.256577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsqlx\" (UniqueName: \"kubernetes.io/projected/5239a0f8-12de-4979-af6c-d209a21bc067-kube-api-access-vsqlx\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.256611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-catalog-content\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.257301 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-utilities\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.257320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-catalog-content\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.278382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsqlx\" (UniqueName: \"kubernetes.io/projected/5239a0f8-12de-4979-af6c-d209a21bc067-kube-api-access-vsqlx\") pod \"redhat-marketplace-whzp4\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.369377 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.369447 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.370707 4749 patch_prober.go:28] interesting pod/console-f9d7485db-kjr8m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.370767 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kjr8m" podUID="89157053-d5d1-40f0-8b36-411d637d8385" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial 
tcp 10.217.0.17:8443: connect: connection refused" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.431547 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g8k42" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.452642 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.535084 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tndcr"] Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.536243 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.557578 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tndcr"] Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.561873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-utilities\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.562021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzcsm\" (UniqueName: \"kubernetes.io/projected/9ec3b90d-1483-485e-ab0a-52af455cc9ea-kube-api-access-zzcsm\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.562049 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-catalog-content\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.562689 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.562728 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.562727 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.562775 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.652386 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.660568 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:24 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:24 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:24 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.660628 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.663209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.664595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.664672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.664884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzcsm\" (UniqueName: \"kubernetes.io/projected/9ec3b90d-1483-485e-ab0a-52af455cc9ea-kube-api-access-zzcsm\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.664957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-catalog-content\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.664988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.665044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-utilities\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.666224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-catalog-content\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.668271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.669014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-utilities\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.670377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.670774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.673317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:24 crc kubenswrapper[4749]: E0128 18:36:24.681746 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.683281 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 18:36:24 crc kubenswrapper[4749]: E0128 18:36:24.688759 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.690686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzcsm\" (UniqueName: \"kubernetes.io/projected/9ec3b90d-1483-485e-ab0a-52af455cc9ea-kube-api-access-zzcsm\") pod \"redhat-marketplace-tndcr\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: E0128 18:36:24.692526 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:24 crc kubenswrapper[4749]: E0128 18:36:24.692577 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.804703 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whzp4"] Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.838249 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.840645 4749 generic.go:334] "Generic (PLEG): container finished" podID="bb263a5f-e760-4afa-9f5e-9138a17957d5" containerID="e083ebcbbc4561fae5a84208e18998d1d80828d7f4243a742a23dedc30b60f5f" exitCode=0 Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.840890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bb263a5f-e760-4afa-9f5e-9138a17957d5","Type":"ContainerDied","Data":"e083ebcbbc4561fae5a84208e18998d1d80828d7f4243a742a23dedc30b60f5f"} Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.871104 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.883952 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.888857 4749 generic.go:334] "Generic (PLEG): container finished" podID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerID="d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7" exitCode=0 Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.899678 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.900506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbrpd" event={"ID":"c03b812d-6833-4c65-887b-0fa0a6c1227a","Type":"ContainerDied","Data":"d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7"} Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.900537 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.900563 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" event={"ID":"a45db790-79fe-4928-b701-d64737024f60","Type":"ContainerStarted","Data":"4187963be65e0224661c02b7690c5bcb483dd4bdc6ffc084d2c7850ed17d534b"} Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.900573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" event={"ID":"a45db790-79fe-4928-b701-d64737024f60","Type":"ContainerStarted","Data":"ccd294db1970901ef31362870de5455fcdf090ae9e6410a192c924efc2a34dea"} Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.913187 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5jdv" Jan 28 18:36:24 crc kubenswrapper[4749]: I0128 18:36:24.942417 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" podStartSLOduration=27.942396977 podStartE2EDuration="27.942396977s" podCreationTimestamp="2026-01-28 18:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:24.939423844 +0000 UTC m=+52.950950639" watchObservedRunningTime="2026-01-28 18:36:24.942396977 +0000 UTC m=+52.953923752" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.163905 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mk2w9"] Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.165276 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.195472 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.229116 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mk2w9"] Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.252646 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.253840 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vddcm" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.297308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-utilities\") pod \"redhat-operators-mk2w9\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.297400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-catalog-content\") pod \"redhat-operators-mk2w9\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.297430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfktk\" (UniqueName: \"kubernetes.io/projected/ebc65c41-25f8-4ee9-9993-a00101a35397-kube-api-access-tfktk\") pod \"redhat-operators-mk2w9\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.398201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-utilities\") pod \"redhat-operators-mk2w9\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.398533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-catalog-content\") pod \"redhat-operators-mk2w9\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.398555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfktk\" (UniqueName: \"kubernetes.io/projected/ebc65c41-25f8-4ee9-9993-a00101a35397-kube-api-access-tfktk\") pod \"redhat-operators-mk2w9\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.399445 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-catalog-content\") pod \"redhat-operators-mk2w9\" (UID: 
\"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.399530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-utilities\") pod \"redhat-operators-mk2w9\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.428904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfktk\" (UniqueName: \"kubernetes.io/projected/ebc65c41-25f8-4ee9-9993-a00101a35397-kube-api-access-tfktk\") pod \"redhat-operators-mk2w9\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.533693 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.545075 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.550452 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.550743 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.552646 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.552689 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-25wll"] Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.554037 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.557703 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.559949 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25wll"] Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.663771 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:25 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:25 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:25 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.664557 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.703145 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-catalog-content\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.703202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl9kb\" (UniqueName: \"kubernetes.io/projected/3dcb2d5f-2613-4277-bba9-89e404c7832a-kube-api-access-zl9kb\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.703223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.703529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-utilities\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.703554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.805196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-utilities\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 
18:36:25.805244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.805308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-catalog-content\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.805374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl9kb\" (UniqueName: \"kubernetes.io/projected/3dcb2d5f-2613-4277-bba9-89e404c7832a-kube-api-access-zl9kb\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.805406 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.805525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.805841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-catalog-content\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.806481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-utilities\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.828642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl9kb\" (UniqueName: \"kubernetes.io/projected/3dcb2d5f-2613-4277-bba9-89e404c7832a-kube-api-access-zl9kb\") pod \"redhat-operators-25wll\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.829775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.897093 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.907594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ce0c0defe1ed09a6c0a45358c29ff8f9ebc2be50aaf06e39abd59ee0ad8697dc"} Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.907652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9e4828e3aa15fb603926543490bbe7f8574f005810467ad8ce11ad50808dc9d0"} Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.907819 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.910473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b46e894d530f4f7861d34eb9e65fc78349220062a633cdd3f054a7a58bdb5d99"} Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.910509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8152d9503675e321a27d9e13aa367de6d3a52ba3b005de7c54b942f5f0b264e9"} Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.916532 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.921002 4749 generic.go:334] "Generic (PLEG): container finished" podID="5239a0f8-12de-4979-af6c-d209a21bc067" containerID="479b9e045041ee86eb5ea6ecd7cc41be3b19fa92143b3e2181fe434bb6d360a7" exitCode=0 Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.922151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whzp4" event={"ID":"5239a0f8-12de-4979-af6c-d209a21bc067","Type":"ContainerDied","Data":"479b9e045041ee86eb5ea6ecd7cc41be3b19fa92143b3e2181fe434bb6d360a7"} Jan 28 18:36:25 crc kubenswrapper[4749]: I0128 18:36:25.922189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whzp4" event={"ID":"5239a0f8-12de-4979-af6c-d209a21bc067","Type":"ContainerStarted","Data":"32176ac25e3067a7ea471a8c548fa31d069a3cf8cdc0a6b54e129d70db92b5a1"} Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.005349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mk2w9"] Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.051545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tndcr"] Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.188298 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.218733 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.220465 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.242158 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.287250 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.287231298 podStartE2EDuration="287.231298ms" podCreationTimestamp="2026-01-28 18:36:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:26.280095732 +0000 UTC m=+54.291622517" watchObservedRunningTime="2026-01-28 18:36:26.287231298 +0000 UTC m=+54.298758073" Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.312522 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25wll"] Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.315703 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb263a5f-e760-4afa-9f5e-9138a17957d5-kube-api-access\") pod \"bb263a5f-e760-4afa-9f5e-9138a17957d5\" (UID: \"bb263a5f-e760-4afa-9f5e-9138a17957d5\") " Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.324011 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb263a5f-e760-4afa-9f5e-9138a17957d5-kubelet-dir\") pod \"bb263a5f-e760-4afa-9f5e-9138a17957d5\" (UID: \"bb263a5f-e760-4afa-9f5e-9138a17957d5\") " Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.324417 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb263a5f-e760-4afa-9f5e-9138a17957d5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bb263a5f-e760-4afa-9f5e-9138a17957d5" (UID: "bb263a5f-e760-4afa-9f5e-9138a17957d5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.330872 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb263a5f-e760-4afa-9f5e-9138a17957d5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bb263a5f-e760-4afa-9f5e-9138a17957d5" (UID: "bb263a5f-e760-4afa-9f5e-9138a17957d5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:36:26 crc kubenswrapper[4749]: W0128 18:36:26.388470 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcb2d5f_2613_4277_bba9_89e404c7832a.slice/crio-1abd0ad8e9f50c9e1339a0c45af704bfdcc1c9cf70b8e7707af08322876c469e WatchSource:0}: Error finding container 1abd0ad8e9f50c9e1339a0c45af704bfdcc1c9cf70b8e7707af08322876c469e: Status 404 returned error can't find the container with id 1abd0ad8e9f50c9e1339a0c45af704bfdcc1c9cf70b8e7707af08322876c469e Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.425689 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb263a5f-e760-4afa-9f5e-9138a17957d5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.425730 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb263a5f-e760-4afa-9f5e-9138a17957d5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.657477 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:26 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:26 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:26 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.657528 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.969234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c361fbc9-21ee-4d2a-9353-93aad7207c5d","Type":"ContainerStarted","Data":"e54c58035b1f58d9fc52ab522585ec1b9de3d26b7a26efe384eb4f35ac016558"} Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.969735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c361fbc9-21ee-4d2a-9353-93aad7207c5d","Type":"ContainerStarted","Data":"8e38a4a77fe7c016d9b862f56a64130c729bcdbd307dff3d613762e4077297ab"} Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.987260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a31bae013432f6ce973c0a5b18f47a708bc14966cd856a5c72aad99dd57d463b"} Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.987312 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5ab82eefbd0054a1b470e4e2c802f41b7338337f9147108bf3753c6ae8e9a3a8"} Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.994121 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerID="1b6c37df7f9cd31fb18e60ed1e6daaef397ab7ecc7f319663f0e1a140e440455" exitCode=0 Jan 
28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.994241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2w9" event={"ID":"ebc65c41-25f8-4ee9-9993-a00101a35397","Type":"ContainerDied","Data":"1b6c37df7f9cd31fb18e60ed1e6daaef397ab7ecc7f319663f0e1a140e440455"} Jan 28 18:36:26 crc kubenswrapper[4749]: I0128 18:36:26.994270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2w9" event={"ID":"ebc65c41-25f8-4ee9-9993-a00101a35397","Type":"ContainerStarted","Data":"cb96efea370f5b5e5b94e69d2e334c99fa82345b0b8f9bf202826805f0418416"} Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.000670 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerID="e1c4945f86f71a02ad519381dadfaecc9c829d3847a629eeca06e89d2ae803de" exitCode=0 Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.000752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tndcr" event={"ID":"9ec3b90d-1483-485e-ab0a-52af455cc9ea","Type":"ContainerDied","Data":"e1c4945f86f71a02ad519381dadfaecc9c829d3847a629eeca06e89d2ae803de"} Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.000781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tndcr" event={"ID":"9ec3b90d-1483-485e-ab0a-52af455cc9ea","Type":"ContainerStarted","Data":"85e49479d2193e3ee625d0350eebd144f266d1c27341a4c40cfe299e93828aee"} Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.005897 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerID="7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a" exitCode=0 Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.005944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25wll" event={"ID":"3dcb2d5f-2613-4277-bba9-89e404c7832a","Type":"ContainerDied","Data":"7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a"} Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.006062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25wll" event={"ID":"3dcb2d5f-2613-4277-bba9-89e404c7832a","Type":"ContainerStarted","Data":"1abd0ad8e9f50c9e1339a0c45af704bfdcc1c9cf70b8e7707af08322876c469e"} Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.011302 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.011926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bb263a5f-e760-4afa-9f5e-9138a17957d5","Type":"ContainerDied","Data":"9cd43352162bdb5ab45a36ef1b7fed7655570651bc6188e80ad12d321512ed12"} Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.012008 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd43352162bdb5ab45a36ef1b7fed7655570651bc6188e80ad12d321512ed12" Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.655505 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:27 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:27 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:27 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:27 crc kubenswrapper[4749]: I0128 18:36:27.655582 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:28 crc kubenswrapper[4749]: I0128 18:36:28.030257 4749 generic.go:334] "Generic (PLEG): container finished" podID="c361fbc9-21ee-4d2a-9353-93aad7207c5d" containerID="e54c58035b1f58d9fc52ab522585ec1b9de3d26b7a26efe384eb4f35ac016558" exitCode=0 Jan 28 18:36:28 crc kubenswrapper[4749]: I0128 18:36:28.030421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c361fbc9-21ee-4d2a-9353-93aad7207c5d","Type":"ContainerDied","Data":"e54c58035b1f58d9fc52ab522585ec1b9de3d26b7a26efe384eb4f35ac016558"} Jan 28 18:36:28 crc kubenswrapper[4749]: I0128 18:36:28.653552 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:28 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:28 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:28 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:28 crc kubenswrapper[4749]: I0128 18:36:28.653637 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:28 crc kubenswrapper[4749]: I0128 18:36:28.957209 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:28 crc kubenswrapper[4749]: I0128 18:36:28.964126 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-dhdg9" Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.395218 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.477695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kubelet-dir\") pod \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\" (UID: \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\") " Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.477760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kube-api-access\") pod \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\" (UID: \"c361fbc9-21ee-4d2a-9353-93aad7207c5d\") " Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.478911 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c361fbc9-21ee-4d2a-9353-93aad7207c5d" (UID: "c361fbc9-21ee-4d2a-9353-93aad7207c5d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.493241 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c361fbc9-21ee-4d2a-9353-93aad7207c5d" (UID: "c361fbc9-21ee-4d2a-9353-93aad7207c5d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.578864 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.578901 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c361fbc9-21ee-4d2a-9353-93aad7207c5d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.655580 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:29 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:29 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:29 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.655651 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:29 crc kubenswrapper[4749]: I0128 18:36:29.691418 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gz9f7" Jan 28 18:36:30 crc kubenswrapper[4749]: I0128 18:36:30.083840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c361fbc9-21ee-4d2a-9353-93aad7207c5d","Type":"ContainerDied","Data":"8e38a4a77fe7c016d9b862f56a64130c729bcdbd307dff3d613762e4077297ab"} Jan 28 18:36:30 crc 
kubenswrapper[4749]: I0128 18:36:30.083872 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 18:36:30 crc kubenswrapper[4749]: I0128 18:36:30.083885 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e38a4a77fe7c016d9b862f56a64130c729bcdbd307dff3d613762e4077297ab" Jan 28 18:36:30 crc kubenswrapper[4749]: I0128 18:36:30.659000 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:30 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:30 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:30 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:30 crc kubenswrapper[4749]: I0128 18:36:30.659055 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:31 crc kubenswrapper[4749]: I0128 18:36:31.653185 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:31 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:31 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:31 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:31 crc kubenswrapper[4749]: I0128 18:36:31.653442 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:32 crc kubenswrapper[4749]: I0128 18:36:32.656888 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:32 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:32 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:32 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:32 crc kubenswrapper[4749]: I0128 18:36:32.657000 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:33 crc kubenswrapper[4749]: I0128 18:36:33.666639 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:33 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:33 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:33 crc kubenswrapper[4749]: healthz check failed Jan 28 18:36:33 crc kubenswrapper[4749]: I0128 18:36:33.667196 4749 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:34 crc kubenswrapper[4749]: I0128 18:36:34.369697 4749 patch_prober.go:28] interesting pod/console-f9d7485db-kjr8m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Jan 28 18:36:34 crc kubenswrapper[4749]: I0128 18:36:34.369755 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kjr8m" podUID="89157053-d5d1-40f0-8b36-411d637d8385" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Jan 28 18:36:34 crc kubenswrapper[4749]: I0128 18:36:34.563290 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:34 crc kubenswrapper[4749]: I0128 18:36:34.563370 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:34 crc kubenswrapper[4749]: I0128 18:36:34.563429 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:34 crc kubenswrapper[4749]: I0128 18:36:34.563483 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:34 crc kubenswrapper[4749]: E0128 18:36:34.642409 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:34 crc kubenswrapper[4749]: E0128 18:36:34.645220 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:34 crc kubenswrapper[4749]: I0128 18:36:34.653519 4749 patch_prober.go:28] interesting pod/router-default-5444994796-pdnfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 18:36:34 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Jan 28 18:36:34 crc kubenswrapper[4749]: [+]process-running ok Jan 28 18:36:34 crc kubenswrapper[4749]: healthz 
check failed Jan 28 18:36:34 crc kubenswrapper[4749]: I0128 18:36:34.653709 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pdnfq" podUID="9ed379e4-88c7-479e-9005-c7980ba50ccd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 18:36:34 crc kubenswrapper[4749]: E0128 18:36:34.657662 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:34 crc kubenswrapper[4749]: E0128 18:36:34.657787 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" Jan 28 18:36:35 crc kubenswrapper[4749]: I0128 18:36:35.657184 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:35 crc kubenswrapper[4749]: I0128 18:36:35.675153 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pdnfq" Jan 28 18:36:40 crc kubenswrapper[4749]: I0128 18:36:40.594679 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqtw5"] Jan 28 18:36:40 crc kubenswrapper[4749]: I0128 18:36:40.595472 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" containerID="cri-o://5573b792054a4fc452aea653e05c6446b98b91b16923cd5446799b23ffc251b6" gracePeriod=30 Jan 28 18:36:40 crc kubenswrapper[4749]: I0128 18:36:40.689129 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z"] Jan 28 18:36:40 crc kubenswrapper[4749]: I0128 18:36:40.689382 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" podUID="e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" containerName="route-controller-manager" containerID="cri-o://c1f0f7551a25556324220ec1025cbfe2e98b52191b4dd409348412cccc6d7163" gracePeriod=30 Jan 28 18:36:40 crc kubenswrapper[4749]: I0128 18:36:40.886524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 28 18:36:42 crc kubenswrapper[4749]: I0128 18:36:42.210500 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerID="5573b792054a4fc452aea653e05c6446b98b91b16923cd5446799b23ffc251b6" exitCode=0 Jan 28 18:36:42 crc kubenswrapper[4749]: I0128 18:36:42.210590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" event={"ID":"4f3a2df0-0830-4e78-a168-31171cf06b76","Type":"ContainerDied","Data":"5573b792054a4fc452aea653e05c6446b98b91b16923cd5446799b23ffc251b6"} Jan 28 18:36:42 crc kubenswrapper[4749]: I0128 18:36:42.212012 4749 
generic.go:334] "Generic (PLEG): container finished" podID="e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" containerID="c1f0f7551a25556324220ec1025cbfe2e98b52191b4dd409348412cccc6d7163" exitCode=0 Jan 28 18:36:42 crc kubenswrapper[4749]: I0128 18:36:42.212033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" event={"ID":"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e","Type":"ContainerDied","Data":"c1f0f7551a25556324220ec1025cbfe2e98b52191b4dd409348412cccc6d7163"} Jan 28 18:36:42 crc kubenswrapper[4749]: I0128 18:36:42.895337 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.895308317 podStartE2EDuration="2.895308317s" podCreationTimestamp="2026-01-28 18:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:36:42.894717593 +0000 UTC m=+70.906244368" watchObservedRunningTime="2026-01-28 18:36:42.895308317 +0000 UTC m=+70.906835092" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.101840 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.132226 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t"] Jan 28 18:36:43 crc kubenswrapper[4749]: E0128 18:36:43.132514 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c361fbc9-21ee-4d2a-9353-93aad7207c5d" containerName="pruner" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.132530 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c361fbc9-21ee-4d2a-9353-93aad7207c5d" containerName="pruner" Jan 28 18:36:43 crc kubenswrapper[4749]: E0128 18:36:43.132545 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" containerName="route-controller-manager" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.132552 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" containerName="route-controller-manager" Jan 28 18:36:43 crc kubenswrapper[4749]: E0128 18:36:43.132570 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb263a5f-e760-4afa-9f5e-9138a17957d5" containerName="pruner" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.132577 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb263a5f-e760-4afa-9f5e-9138a17957d5" containerName="pruner" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.132688 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c361fbc9-21ee-4d2a-9353-93aad7207c5d" containerName="pruner" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.132704 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" containerName="route-controller-manager" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.132714 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb263a5f-e760-4afa-9f5e-9138a17957d5" containerName="pruner" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.133102 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.142891 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t"] Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.219930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" event={"ID":"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e","Type":"ContainerDied","Data":"14f6a585a191996c28599333710916de8f8724e55e58c7c6f0fd7238028e59ee"} Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.220018 4749 scope.go:117] "RemoveContainer" containerID="c1f0f7551a25556324220ec1025cbfe2e98b52191b4dd409348412cccc6d7163" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.220158 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.238845 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-client-ca\") pod \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.238913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-serving-cert\") pod \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.239018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-config\") pod \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.239098 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r6d2\" (UniqueName: \"kubernetes.io/projected/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-kube-api-access-9r6d2\") pod \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\" (UID: \"e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e\") " Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.239528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d30b2a-69ba-4c36-b67c-dad48a315d96-serving-cert\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.239595 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvfcz\" (UniqueName: \"kubernetes.io/projected/73d30b2a-69ba-4c36-b67c-dad48a315d96-kube-api-access-wvfcz\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.239648 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-client-ca\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.239691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-config\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.246375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-config" (OuterVolumeSpecName: "config") pod "e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" (UID: "e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.246350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-client-ca" (OuterVolumeSpecName: "client-ca") pod "e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" (UID: "e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.248068 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" (UID: "e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.249503 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-kube-api-access-9r6d2" (OuterVolumeSpecName: "kube-api-access-9r6d2") pod "e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" (UID: "e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e"). InnerVolumeSpecName "kube-api-access-9r6d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.340614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-client-ca\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.340710 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-config\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.340753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d30b2a-69ba-4c36-b67c-dad48a315d96-serving-cert\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.340787 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvfcz\" (UniqueName: \"kubernetes.io/projected/73d30b2a-69ba-4c36-b67c-dad48a315d96-kube-api-access-wvfcz\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.340829 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.340839 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.340847 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.340856 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r6d2\" (UniqueName: \"kubernetes.io/projected/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e-kube-api-access-9r6d2\") on node \"crc\" DevicePath \"\"" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.342155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-client-ca\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.342290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-config\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: 
\"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.344931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d30b2a-69ba-4c36-b67c-dad48a315d96-serving-cert\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.359136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvfcz\" (UniqueName: \"kubernetes.io/projected/73d30b2a-69ba-4c36-b67c-dad48a315d96-kube-api-access-wvfcz\") pod \"route-controller-manager-57d96b979-wwx9t\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.489169 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.548495 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z"] Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.552247 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mq6z"] Jan 28 18:36:43 crc kubenswrapper[4749]: I0128 18:36:43.577597 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.379945 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.386849 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.562448 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.562515 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.562557 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.562917 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"97d3a1b0b0a6f64466783daecd9a3fd858e509354735dcc37f2df8c44538601d"} pod="openshift-console/downloads-7954f5f757-8tmng" containerMessage="Container download-server failed liveness probe, will be restarted" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.563088 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" containerID="cri-o://97d3a1b0b0a6f64466783daecd9a3fd858e509354735dcc37f2df8c44538601d" gracePeriod=2 Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.564361 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.564389 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.565819 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.565867 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:44 crc kubenswrapper[4749]: E0128 18:36:44.641774 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:44 crc kubenswrapper[4749]: E0128 18:36:44.643301 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:44 crc kubenswrapper[4749]: E0128 18:36:44.644611 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:44 crc kubenswrapper[4749]: E0128 18:36:44.644651 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.864209 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pqtw5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.864277 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 18:36:44 crc kubenswrapper[4749]: I0128 18:36:44.880937 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e" path="/var/lib/kubelet/pods/e71f5834-fb23-4bcf-a7de-5b1e6e7cf34e/volumes" Jan 28 18:36:46 crc kubenswrapper[4749]: I0128 18:36:46.251374 4749 generic.go:334] "Generic (PLEG): container finished" podID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerID="97d3a1b0b0a6f64466783daecd9a3fd858e509354735dcc37f2df8c44538601d" exitCode=0 Jan 28 18:36:46 crc kubenswrapper[4749]: I0128 18:36:46.251415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8tmng" event={"ID":"384d6d65-9777-47a4-bef0-dbeeb9959e66","Type":"ContainerDied","Data":"97d3a1b0b0a6f64466783daecd9a3fd858e509354735dcc37f2df8c44538601d"} Jan 28 18:36:51 crc kubenswrapper[4749]: I0128 18:36:51.276600 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-tms2k_43900a49-0d8f-48a8-b6af-385321464445/kube-multus-additional-cni-plugins/0.log" Jan 28 18:36:51 crc kubenswrapper[4749]: I0128 18:36:51.277086 4749 generic.go:334] "Generic (PLEG): container finished" podID="43900a49-0d8f-48a8-b6af-385321464445" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" exitCode=137 Jan 28 18:36:51 crc kubenswrapper[4749]: I0128 18:36:51.277119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" event={"ID":"43900a49-0d8f-48a8-b6af-385321464445","Type":"ContainerDied","Data":"c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826"} Jan 28 18:36:54 crc kubenswrapper[4749]: I0128 18:36:54.562903 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:36:54 crc kubenswrapper[4749]: I0128 18:36:54.562989 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:36:54 crc kubenswrapper[4749]: E0128 18:36:54.640613 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826 is running failed: container process not found" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:54 crc kubenswrapper[4749]: E0128 18:36:54.641283 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is 
not created or running: checking if PID of c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826 is running failed: container process not found" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:54 crc kubenswrapper[4749]: E0128 18:36:54.641956 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826 is running failed: container process not found" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:36:54 crc kubenswrapper[4749]: E0128 18:36:54.642162 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" Jan 28 18:36:54 crc kubenswrapper[4749]: I0128 18:36:54.667937 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lbzfp" Jan 28 18:36:54 crc kubenswrapper[4749]: I0128 18:36:54.865193 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pqtw5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded" start-of-body= Jan 28 18:36:54 crc kubenswrapper[4749]: I0128 18:36:54.865700 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded" Jan 28 18:36:57 crc kubenswrapper[4749]: I0128 18:36:57.885105 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 28 18:37:00 crc kubenswrapper[4749]: I0128 18:37:00.506083 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t"] Jan 28 18:37:00 crc kubenswrapper[4749]: E0128 18:37:00.909233 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 28 18:37:00 crc kubenswrapper[4749]: E0128 18:37:00.909897 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vsqlx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-whzp4_openshift-marketplace(5239a0f8-12de-4979-af6c-d209a21bc067): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 18:37:00 crc kubenswrapper[4749]: E0128 18:37:00.911408 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-whzp4" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" Jan 28 18:37:01 crc kubenswrapper[4749]: I0128 18:37:01.392496 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.392469412 podStartE2EDuration="4.392469412s" podCreationTimestamp="2026-01-28 18:36:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:37:01.391632842 +0000 UTC m=+89.403159627" watchObservedRunningTime="2026-01-28 18:37:01.392469412 +0000 UTC m=+89.403996197" Jan 28 18:37:02 crc kubenswrapper[4749]: I0128 18:37:02.915176 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 18:37:02 crc kubenswrapper[4749]: I0128 18:37:02.915987 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:02 crc kubenswrapper[4749]: I0128 18:37:02.919528 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 28 18:37:02 crc kubenswrapper[4749]: I0128 18:37:02.920206 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 28 18:37:02 crc kubenswrapper[4749]: I0128 18:37:02.929501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 18:37:03 crc kubenswrapper[4749]: I0128 18:37:03.072267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:03 crc kubenswrapper[4749]: I0128 18:37:03.072937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:03 crc kubenswrapper[4749]: I0128 18:37:03.174481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:03 crc kubenswrapper[4749]: I0128 18:37:03.174611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:03 crc kubenswrapper[4749]: I0128 18:37:03.174645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:03 crc kubenswrapper[4749]: I0128 18:37:03.198306 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:03 crc kubenswrapper[4749]: I0128 18:37:03.289275 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.216355 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-whzp4" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.258748 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.286186 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-756f5b687d-n96jl"] Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.286568 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.286599 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.286763 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.287360 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.290981 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-756f5b687d-n96jl"] Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.351079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" event={"ID":"4f3a2df0-0830-4e78-a168-31171cf06b76","Type":"ContainerDied","Data":"670ace17908b647ea2514b8b25a43bf80ad83034024842eba8b8bd6429e3b649"} Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.351171 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.389922 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-client-ca\") pod \"4f3a2df0-0830-4e78-a168-31171cf06b76\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-proxy-ca-bundles\") pod \"4f3a2df0-0830-4e78-a168-31171cf06b76\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390052 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3a2df0-0830-4e78-a168-31171cf06b76-serving-cert\") pod \"4f3a2df0-0830-4e78-a168-31171cf06b76\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk7kb\" (UniqueName: \"kubernetes.io/projected/4f3a2df0-0830-4e78-a168-31171cf06b76-kube-api-access-gk7kb\") pod \"4f3a2df0-0830-4e78-a168-31171cf06b76\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390169 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-config\") pod \"4f3a2df0-0830-4e78-a168-31171cf06b76\" (UID: \"4f3a2df0-0830-4e78-a168-31171cf06b76\") " Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390355 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c16165-8b11-45c3-94aa-c284d21e271f-serving-cert\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390384 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-proxy-ca-bundles\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390402 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-client-ca\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390434 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-config\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " 
pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.390484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt595\" (UniqueName: \"kubernetes.io/projected/a9c16165-8b11-45c3-94aa-c284d21e271f-kube-api-access-gt595\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.391599 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-client-ca" (OuterVolumeSpecName: "client-ca") pod "4f3a2df0-0830-4e78-a168-31171cf06b76" (UID: "4f3a2df0-0830-4e78-a168-31171cf06b76"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.392018 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4f3a2df0-0830-4e78-a168-31171cf06b76" (UID: "4f3a2df0-0830-4e78-a168-31171cf06b76"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.394718 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-config" (OuterVolumeSpecName: "config") pod "4f3a2df0-0830-4e78-a168-31171cf06b76" (UID: "4f3a2df0-0830-4e78-a168-31171cf06b76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.396443 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.396599 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzcsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-tndcr_openshift-marketplace(9ec3b90d-1483-485e-ab0a-52af455cc9ea): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.397822 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tndcr" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.400881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3a2df0-0830-4e78-a168-31171cf06b76-kube-api-access-gk7kb" (OuterVolumeSpecName: "kube-api-access-gk7kb") pod "4f3a2df0-0830-4e78-a168-31171cf06b76" (UID: "4f3a2df0-0830-4e78-a168-31171cf06b76"). InnerVolumeSpecName "kube-api-access-gk7kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.411131 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3a2df0-0830-4e78-a168-31171cf06b76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4f3a2df0-0830-4e78-a168-31171cf06b76" (UID: "4f3a2df0-0830-4e78-a168-31171cf06b76"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.491835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c16165-8b11-45c3-94aa-c284d21e271f-serving-cert\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-proxy-ca-bundles\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492293 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-client-ca\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-config\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492433 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt595\" (UniqueName: \"kubernetes.io/projected/a9c16165-8b11-45c3-94aa-c284d21e271f-kube-api-access-gt595\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492501 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492615 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3a2df0-0830-4e78-a168-31171cf06b76-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492706 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk7kb\" (UniqueName: \"kubernetes.io/projected/4f3a2df0-0830-4e78-a168-31171cf06b76-kube-api-access-gk7kb\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492778 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.492960 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f3a2df0-0830-4e78-a168-31171cf06b76-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.493627 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-client-ca\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.493793 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-config\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.504315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c16165-8b11-45c3-94aa-c284d21e271f-serving-cert\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.508401 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt595\" (UniqueName: \"kubernetes.io/projected/a9c16165-8b11-45c3-94aa-c284d21e271f-kube-api-access-gt595\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.563461 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.563527 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.640846 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826 is running failed: container process not found" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.641146 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826 is running failed: container process not found" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.641486 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826 is running failed: container process not found" 
containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 18:37:04 crc kubenswrapper[4749]: E0128 18:37:04.641519 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.662816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-proxy-ca-bundles\") pod \"controller-manager-756f5b687d-n96jl\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.681435 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqtw5"] Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.684797 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pqtw5"] Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.865238 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pqtw5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.865307 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pqtw5" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.879468 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3a2df0-0830-4e78-a168-31171cf06b76" path="/var/lib/kubelet/pods/4f3a2df0-0830-4e78-a168-31171cf06b76/volumes" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.895881 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 18:37:04 crc kubenswrapper[4749]: I0128 18:37:04.910020 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.113420 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.114593 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.129811 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.238687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e45df95-120e-4171-8e11-01988340d6aa-kube-api-access\") pod \"installer-9-crc\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.238745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-var-lock\") pod \"installer-9-crc\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.238790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.340294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.340428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.340441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e45df95-120e-4171-8e11-01988340d6aa-kube-api-access\") pod \"installer-9-crc\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.340510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-var-lock\") pod \"installer-9-crc\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.340686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-var-lock\") pod \"installer-9-crc\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.359666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e45df95-120e-4171-8e11-01988340d6aa-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"7e45df95-120e-4171-8e11-01988340d6aa\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.444041 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:08 crc kubenswrapper[4749]: E0128 18:37:08.736534 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tndcr" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.747810 4749 scope.go:117] "RemoveContainer" containerID="5573b792054a4fc452aea653e05c6446b98b91b16923cd5446799b23ffc251b6" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.790776 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-tms2k_43900a49-0d8f-48a8-b6af-385321464445/kube-multus-additional-cni-plugins/0.log" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.790893 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.950023 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist\") pod \"43900a49-0d8f-48a8-b6af-385321464445\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.950110 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/43900a49-0d8f-48a8-b6af-385321464445-ready\") pod \"43900a49-0d8f-48a8-b6af-385321464445\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.950198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf7h8\" (UniqueName: \"kubernetes.io/projected/43900a49-0d8f-48a8-b6af-385321464445-kube-api-access-bf7h8\") pod \"43900a49-0d8f-48a8-b6af-385321464445\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.950226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43900a49-0d8f-48a8-b6af-385321464445-tuning-conf-dir\") pod \"43900a49-0d8f-48a8-b6af-385321464445\" (UID: \"43900a49-0d8f-48a8-b6af-385321464445\") " Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.950417 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43900a49-0d8f-48a8-b6af-385321464445-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "43900a49-0d8f-48a8-b6af-385321464445" (UID: "43900a49-0d8f-48a8-b6af-385321464445"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.950905 4749 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43900a49-0d8f-48a8-b6af-385321464445-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.951046 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "43900a49-0d8f-48a8-b6af-385321464445" (UID: "43900a49-0d8f-48a8-b6af-385321464445"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.951541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43900a49-0d8f-48a8-b6af-385321464445-ready" (OuterVolumeSpecName: "ready") pod "43900a49-0d8f-48a8-b6af-385321464445" (UID: "43900a49-0d8f-48a8-b6af-385321464445"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:37:08 crc kubenswrapper[4749]: I0128 18:37:08.955914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43900a49-0d8f-48a8-b6af-385321464445-kube-api-access-bf7h8" (OuterVolumeSpecName: "kube-api-access-bf7h8") pod "43900a49-0d8f-48a8-b6af-385321464445" (UID: "43900a49-0d8f-48a8-b6af-385321464445"). InnerVolumeSpecName "kube-api-access-bf7h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:37:09 crc kubenswrapper[4749]: I0128 18:37:09.052357 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43900a49-0d8f-48a8-b6af-385321464445-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:09 crc kubenswrapper[4749]: I0128 18:37:09.052408 4749 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/43900a49-0d8f-48a8-b6af-385321464445-ready\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:09 crc kubenswrapper[4749]: I0128 18:37:09.052418 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf7h8\" (UniqueName: \"kubernetes.io/projected/43900a49-0d8f-48a8-b6af-385321464445-kube-api-access-bf7h8\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:09 crc kubenswrapper[4749]: I0128 18:37:09.377458 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-tms2k_43900a49-0d8f-48a8-b6af-385321464445/kube-multus-additional-cni-plugins/0.log" Jan 28 18:37:09 crc kubenswrapper[4749]: I0128 18:37:09.377535 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" event={"ID":"43900a49-0d8f-48a8-b6af-385321464445","Type":"ContainerDied","Data":"15f563570ee42693245cfffb52b00a29355508cc34073d5a4bcbb1ae3f50ea78"} Jan 28 18:37:09 crc kubenswrapper[4749]: I0128 18:37:09.377555 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-tms2k" Jan 28 18:37:09 crc kubenswrapper[4749]: I0128 18:37:09.402401 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-tms2k"] Jan 28 18:37:09 crc kubenswrapper[4749]: I0128 18:37:09.403402 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-tms2k"] Jan 28 18:37:09 crc kubenswrapper[4749]: E0128 18:37:09.673715 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 28 18:37:09 crc kubenswrapper[4749]: E0128 18:37:09.674035 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jqxqx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-4ngh4_openshift-marketplace(b76a4267-3557-4246-b3dc-84a610d9fbd4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 18:37:09 crc kubenswrapper[4749]: E0128 18:37:09.675258 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4ngh4" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" Jan 28 18:37:10 crc kubenswrapper[4749]: I0128 18:37:10.878365 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43900a49-0d8f-48a8-b6af-385321464445" path="/var/lib/kubelet/pods/43900a49-0d8f-48a8-b6af-385321464445/volumes" Jan 28 18:37:11 crc kubenswrapper[4749]: E0128 18:37:11.038397 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4ngh4" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" Jan 28 18:37:11 crc kubenswrapper[4749]: E0128 18:37:11.150089 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 28 18:37:11 crc kubenswrapper[4749]: E0128 18:37:11.150565 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n9fdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jbrpd_openshift-marketplace(c03b812d-6833-4c65-887b-0fa0a6c1227a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 18:37:11 crc kubenswrapper[4749]: E0128 18:37:11.151705 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jbrpd" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.018110 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jbrpd" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.061979 4749 scope.go:117] "RemoveContainer" containerID="c08c8c38c7906218d74d769d82d1edd0961ce4ebe4c64ec23d85d31d5b229826" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.112900 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.113724 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfktk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mk2w9_openshift-marketplace(ebc65c41-25f8-4ee9-9993-a00101a35397): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.115559 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mk2w9" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.160617 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.161145 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zl9kb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-25wll_openshift-marketplace(3dcb2d5f-2613-4277-bba9-89e404c7832a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.162887 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-25wll" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.251721 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.251860 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2dl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rhv8w_openshift-marketplace(ffc6bd30-5803-4a01-a711-4f3b3c718750): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.253221 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rhv8w" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.301516 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t"] Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.349500 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.376288 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.377035 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hn59z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-cn95r_openshift-marketplace(01642193-d926-44c5-908a-98716476032b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.378463 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cn95r" podUID="01642193-d926-44c5-908a-98716476032b" Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.407668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce","Type":"ContainerStarted","Data":"6c56c53250b293858f796afaa7af315b370abde7213b9d302adb17ea4abd70af"} Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.408874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" event={"ID":"73d30b2a-69ba-4c36-b67c-dad48a315d96","Type":"ContainerStarted","Data":"80b7d64643750be11d5bb6207e6f643af72f2d95248d1fa0763f04cc055a0d7a"} Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.412554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8tmng" event={"ID":"384d6d65-9777-47a4-bef0-dbeeb9959e66","Type":"ContainerStarted","Data":"8782055332fb4b40daf7ccd681802e3d1870ce70167fb790bf463a1d29df4f70"} Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.413894 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.415786 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.415832 4749 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.424621 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mk2w9" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.425081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rhv8w" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.425153 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cn95r" podUID="01642193-d926-44c5-908a-98716476032b" Jan 28 18:37:14 crc kubenswrapper[4749]: E0128 18:37:14.425190 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-25wll" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.562847 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.562928 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.563201 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.563248 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.615548 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 28 18:37:14 crc kubenswrapper[4749]: I0128 18:37:14.620501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-756f5b687d-n96jl"] Jan 28 18:37:14 crc kubenswrapper[4749]: W0128 18:37:14.623555 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7e45df95_120e_4171_8e11_01988340d6aa.slice/crio-f6aada7bc62fd480d1f750be1f0b6abc3bff587dbb7943ec7a94337b177da0bd WatchSource:0}: Error finding container f6aada7bc62fd480d1f750be1f0b6abc3bff587dbb7943ec7a94337b177da0bd: Status 404 returned error can't find the container with id f6aada7bc62fd480d1f750be1f0b6abc3bff587dbb7943ec7a94337b177da0bd Jan 28 18:37:14 crc kubenswrapper[4749]: W0128 18:37:14.623959 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c16165_8b11_45c3_94aa_c284d21e271f.slice/crio-ee520f31e33787eb7d21b7e4c6c93405d3dd059da41f407dfb658da92dfda2a4 WatchSource:0}: Error finding container ee520f31e33787eb7d21b7e4c6c93405d3dd059da41f407dfb658da92dfda2a4: Status 404 returned error can't find the container with id ee520f31e33787eb7d21b7e4c6c93405d3dd059da41f407dfb658da92dfda2a4 Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.426445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" event={"ID":"73d30b2a-69ba-4c36-b67c-dad48a315d96","Type":"ContainerStarted","Data":"85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f"} Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.427126 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.426560 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" podUID="73d30b2a-69ba-4c36-b67c-dad48a315d96" containerName="route-controller-manager" containerID="cri-o://85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f" gracePeriod=30 Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.431085 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.431567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" event={"ID":"a9c16165-8b11-45c3-94aa-c284d21e271f","Type":"ContainerStarted","Data":"a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe"} Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.431601 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" event={"ID":"a9c16165-8b11-45c3-94aa-c284d21e271f","Type":"ContainerStarted","Data":"ee520f31e33787eb7d21b7e4c6c93405d3dd059da41f407dfb658da92dfda2a4"} Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.432665 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.435282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7e45df95-120e-4171-8e11-01988340d6aa","Type":"ContainerStarted","Data":"dfdded196cca90d6ff8b284a123c5f701acd6e6e04ec96621f5fb1f55a9ee2d2"} Jan 28 18:37:15 crc kubenswrapper[4749]: 
I0128 18:37:15.435310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7e45df95-120e-4171-8e11-01988340d6aa","Type":"ContainerStarted","Data":"f6aada7bc62fd480d1f750be1f0b6abc3bff587dbb7943ec7a94337b177da0bd"} Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.437701 4749 generic.go:334] "Generic (PLEG): container finished" podID="79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce" containerID="64e0106d6a08d7ac0ef710c488f7eb821cf16c729fc1786002867abbb9856fda" exitCode=0 Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.437785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce","Type":"ContainerDied","Data":"64e0106d6a08d7ac0ef710c488f7eb821cf16c729fc1786002867abbb9856fda"} Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.438189 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.438222 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.439871 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.450170 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" podStartSLOduration=35.450152356 podStartE2EDuration="35.450152356s" podCreationTimestamp="2026-01-28 18:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:37:15.447231154 +0000 UTC m=+103.458757929" watchObservedRunningTime="2026-01-28 18:37:15.450152356 +0000 UTC m=+103.461679141" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.510342 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.5103090550000005 podStartE2EDuration="7.510309055s" podCreationTimestamp="2026-01-28 18:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:37:15.508214924 +0000 UTC m=+103.519741699" watchObservedRunningTime="2026-01-28 18:37:15.510309055 +0000 UTC m=+103.521835830" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.528386 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" podStartSLOduration=15.52836849 podStartE2EDuration="15.52836849s" podCreationTimestamp="2026-01-28 18:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:37:15.524031413 +0000 UTC m=+103.535558208" watchObservedRunningTime="2026-01-28 18:37:15.52836849 +0000 UTC m=+103.539895265" Jan 28 18:37:15 
crc kubenswrapper[4749]: I0128 18:37:15.768220 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.798285 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9"] Jan 28 18:37:15 crc kubenswrapper[4749]: E0128 18:37:15.798912 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.798925 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" Jan 28 18:37:15 crc kubenswrapper[4749]: E0128 18:37:15.798944 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d30b2a-69ba-4c36-b67c-dad48a315d96" containerName="route-controller-manager" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.798951 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d30b2a-69ba-4c36-b67c-dad48a315d96" containerName="route-controller-manager" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.799042 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d30b2a-69ba-4c36-b67c-dad48a315d96" containerName="route-controller-manager" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.799056 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="43900a49-0d8f-48a8-b6af-385321464445" containerName="kube-multus-additional-cni-plugins" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.799438 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.803158 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9"] Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.948699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-client-ca\") pod \"73d30b2a-69ba-4c36-b67c-dad48a315d96\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.948798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-config\") pod \"73d30b2a-69ba-4c36-b67c-dad48a315d96\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.948826 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d30b2a-69ba-4c36-b67c-dad48a315d96-serving-cert\") pod \"73d30b2a-69ba-4c36-b67c-dad48a315d96\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.948861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvfcz\" (UniqueName: \"kubernetes.io/projected/73d30b2a-69ba-4c36-b67c-dad48a315d96-kube-api-access-wvfcz\") pod \"73d30b2a-69ba-4c36-b67c-dad48a315d96\" (UID: \"73d30b2a-69ba-4c36-b67c-dad48a315d96\") " Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.949009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-config\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.949037 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58752ef3-6725-4333-ae18-21909c443ff2-serving-cert\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.949062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d77v\" (UniqueName: \"kubernetes.io/projected/58752ef3-6725-4333-ae18-21909c443ff2-kube-api-access-6d77v\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.949085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-client-ca\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:15 crc 
kubenswrapper[4749]: I0128 18:37:15.949676 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-client-ca" (OuterVolumeSpecName: "client-ca") pod "73d30b2a-69ba-4c36-b67c-dad48a315d96" (UID: "73d30b2a-69ba-4c36-b67c-dad48a315d96"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.949979 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-config" (OuterVolumeSpecName: "config") pod "73d30b2a-69ba-4c36-b67c-dad48a315d96" (UID: "73d30b2a-69ba-4c36-b67c-dad48a315d96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.955513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d30b2a-69ba-4c36-b67c-dad48a315d96-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "73d30b2a-69ba-4c36-b67c-dad48a315d96" (UID: "73d30b2a-69ba-4c36-b67c-dad48a315d96"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:15 crc kubenswrapper[4749]: I0128 18:37:15.956219 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d30b2a-69ba-4c36-b67c-dad48a315d96-kube-api-access-wvfcz" (OuterVolumeSpecName: "kube-api-access-wvfcz") pod "73d30b2a-69ba-4c36-b67c-dad48a315d96" (UID: "73d30b2a-69ba-4c36-b67c-dad48a315d96"). InnerVolumeSpecName "kube-api-access-wvfcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.050149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-client-ca\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.050243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-config\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.050272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58752ef3-6725-4333-ae18-21909c443ff2-serving-cert\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.050302 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d77v\" (UniqueName: \"kubernetes.io/projected/58752ef3-6725-4333-ae18-21909c443ff2-kube-api-access-6d77v\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.050358 4749 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.050368 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d30b2a-69ba-4c36-b67c-dad48a315d96-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.050377 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvfcz\" (UniqueName: \"kubernetes.io/projected/73d30b2a-69ba-4c36-b67c-dad48a315d96-kube-api-access-wvfcz\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.050387 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73d30b2a-69ba-4c36-b67c-dad48a315d96-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.052080 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-client-ca\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.052546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-config\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.055135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58752ef3-6725-4333-ae18-21909c443ff2-serving-cert\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.066265 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d77v\" (UniqueName: \"kubernetes.io/projected/58752ef3-6725-4333-ae18-21909c443ff2-kube-api-access-6d77v\") pod \"route-controller-manager-6885bdfc9f-6z6z9\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.117221 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.300413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9"] Jan 28 18:37:16 crc kubenswrapper[4749]: W0128 18:37:16.306832 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58752ef3_6725_4333_ae18_21909c443ff2.slice/crio-c57ea30a8ceabaef5e3e7f7e0e0975917ffb5cec6df2b21c2c85d3257163e0b4 WatchSource:0}: Error finding container c57ea30a8ceabaef5e3e7f7e0e0975917ffb5cec6df2b21c2c85d3257163e0b4: Status 404 returned error can't find the container with id c57ea30a8ceabaef5e3e7f7e0e0975917ffb5cec6df2b21c2c85d3257163e0b4 Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.447025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" event={"ID":"58752ef3-6725-4333-ae18-21909c443ff2","Type":"ContainerStarted","Data":"2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df"} Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.447346 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.447363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" event={"ID":"58752ef3-6725-4333-ae18-21909c443ff2","Type":"ContainerStarted","Data":"c57ea30a8ceabaef5e3e7f7e0e0975917ffb5cec6df2b21c2c85d3257163e0b4"} Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.448830 4749 generic.go:334] "Generic (PLEG): container finished" podID="73d30b2a-69ba-4c36-b67c-dad48a315d96" containerID="85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f" exitCode=0 Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.448891 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.448946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" event={"ID":"73d30b2a-69ba-4c36-b67c-dad48a315d96","Type":"ContainerDied","Data":"85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f"} Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.448971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t" event={"ID":"73d30b2a-69ba-4c36-b67c-dad48a315d96","Type":"ContainerDied","Data":"80b7d64643750be11d5bb6207e6f643af72f2d95248d1fa0763f04cc055a0d7a"} Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.448960 4749 patch_prober.go:28] interesting pod/route-controller-manager-6885bdfc9f-6z6z9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.449037 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" podUID="58752ef3-6725-4333-ae18-21909c443ff2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.449364 4749 scope.go:117] "RemoveContainer" containerID="85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.450968 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8tmng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.450998 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8tmng" podUID="384d6d65-9777-47a4-bef0-dbeeb9959e66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.464276 4749 scope.go:117] "RemoveContainer" containerID="85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f" Jan 28 18:37:16 crc kubenswrapper[4749]: E0128 18:37:16.465365 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f\": container with ID starting with 85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f not found: ID does not exist" containerID="85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.466936 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f"} err="failed to get container status \"85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f\": rpc error: code = NotFound desc = could not find container 
\"85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f\": container with ID starting with 85aaf281d970f06ddd154ccf3aba32994d1a8e5574be1a82bf2a75bf4055d54f not found: ID does not exist" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.478659 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" podStartSLOduration=16.478643737 podStartE2EDuration="16.478643737s" podCreationTimestamp="2026-01-28 18:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:37:16.467602455 +0000 UTC m=+104.479129250" watchObservedRunningTime="2026-01-28 18:37:16.478643737 +0000 UTC m=+104.490170512" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.481506 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t"] Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.485700 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-wwx9t"] Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.651214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.760871 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kubelet-dir\") pod \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\" (UID: \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\") " Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.760973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kube-api-access\") pod \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\" (UID: \"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce\") " Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.760985 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce" (UID: "79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.761317 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.766526 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce" (UID: "79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.863011 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:16 crc kubenswrapper[4749]: I0128 18:37:16.889512 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d30b2a-69ba-4c36-b67c-dad48a315d96" path="/var/lib/kubelet/pods/73d30b2a-69ba-4c36-b67c-dad48a315d96/volumes" Jan 28 18:37:17 crc kubenswrapper[4749]: I0128 18:37:17.465546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce","Type":"ContainerDied","Data":"6c56c53250b293858f796afaa7af315b370abde7213b9d302adb17ea4abd70af"} Jan 28 18:37:17 crc kubenswrapper[4749]: I0128 18:37:17.465591 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c56c53250b293858f796afaa7af315b370abde7213b9d302adb17ea4abd70af" Jan 28 18:37:17 crc kubenswrapper[4749]: I0128 18:37:17.465602 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 18:37:17 crc kubenswrapper[4749]: I0128 18:37:17.471187 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:18 crc kubenswrapper[4749]: I0128 18:37:18.582303 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9dfj"] Jan 28 18:37:24 crc kubenswrapper[4749]: I0128 18:37:24.568086 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8tmng" Jan 28 18:37:40 crc kubenswrapper[4749]: I0128 18:37:40.496041 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-756f5b687d-n96jl"] Jan 28 18:37:40 crc kubenswrapper[4749]: I0128 18:37:40.496829 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" podUID="a9c16165-8b11-45c3-94aa-c284d21e271f" containerName="controller-manager" containerID="cri-o://a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe" gracePeriod=30 Jan 28 18:37:40 crc kubenswrapper[4749]: I0128 18:37:40.598521 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9"] Jan 28 18:37:40 crc kubenswrapper[4749]: I0128 18:37:40.598787 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" podUID="58752ef3-6725-4333-ae18-21909c443ff2" containerName="route-controller-manager" containerID="cri-o://2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df" gracePeriod=30 Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.284431 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.433174 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.469970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58752ef3-6725-4333-ae18-21909c443ff2-serving-cert\") pod \"58752ef3-6725-4333-ae18-21909c443ff2\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.470237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d77v\" (UniqueName: \"kubernetes.io/projected/58752ef3-6725-4333-ae18-21909c443ff2-kube-api-access-6d77v\") pod \"58752ef3-6725-4333-ae18-21909c443ff2\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.470421 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-config\") pod \"58752ef3-6725-4333-ae18-21909c443ff2\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.470495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-client-ca\") pod \"58752ef3-6725-4333-ae18-21909c443ff2\" (UID: \"58752ef3-6725-4333-ae18-21909c443ff2\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.471474 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-client-ca" (OuterVolumeSpecName: "client-ca") pod "58752ef3-6725-4333-ae18-21909c443ff2" (UID: "58752ef3-6725-4333-ae18-21909c443ff2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.474068 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-config" (OuterVolumeSpecName: "config") pod "58752ef3-6725-4333-ae18-21909c443ff2" (UID: "58752ef3-6725-4333-ae18-21909c443ff2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.476924 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58752ef3-6725-4333-ae18-21909c443ff2-kube-api-access-6d77v" (OuterVolumeSpecName: "kube-api-access-6d77v") pod "58752ef3-6725-4333-ae18-21909c443ff2" (UID: "58752ef3-6725-4333-ae18-21909c443ff2"). InnerVolumeSpecName "kube-api-access-6d77v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.483069 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58752ef3-6725-4333-ae18-21909c443ff2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58752ef3-6725-4333-ae18-21909c443ff2" (UID: "58752ef3-6725-4333-ae18-21909c443ff2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572061 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-client-ca\") pod \"a9c16165-8b11-45c3-94aa-c284d21e271f\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572172 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c16165-8b11-45c3-94aa-c284d21e271f-serving-cert\") pod \"a9c16165-8b11-45c3-94aa-c284d21e271f\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-config\") pod \"a9c16165-8b11-45c3-94aa-c284d21e271f\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-proxy-ca-bundles\") pod \"a9c16165-8b11-45c3-94aa-c284d21e271f\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572353 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt595\" (UniqueName: \"kubernetes.io/projected/a9c16165-8b11-45c3-94aa-c284d21e271f-kube-api-access-gt595\") pod \"a9c16165-8b11-45c3-94aa-c284d21e271f\" (UID: \"a9c16165-8b11-45c3-94aa-c284d21e271f\") " Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572591 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d77v\" (UniqueName: \"kubernetes.io/projected/58752ef3-6725-4333-ae18-21909c443ff2-kube-api-access-6d77v\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572606 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572617 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58752ef3-6725-4333-ae18-21909c443ff2-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.572625 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58752ef3-6725-4333-ae18-21909c443ff2-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.573930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9c16165-8b11-45c3-94aa-c284d21e271f" (UID: "a9c16165-8b11-45c3-94aa-c284d21e271f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.573964 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-config" (OuterVolumeSpecName: "config") pod "a9c16165-8b11-45c3-94aa-c284d21e271f" (UID: "a9c16165-8b11-45c3-94aa-c284d21e271f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.574192 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9c16165-8b11-45c3-94aa-c284d21e271f" (UID: "a9c16165-8b11-45c3-94aa-c284d21e271f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.576477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c16165-8b11-45c3-94aa-c284d21e271f-kube-api-access-gt595" (OuterVolumeSpecName: "kube-api-access-gt595") pod "a9c16165-8b11-45c3-94aa-c284d21e271f" (UID: "a9c16165-8b11-45c3-94aa-c284d21e271f"). InnerVolumeSpecName "kube-api-access-gt595". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.577483 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c16165-8b11-45c3-94aa-c284d21e271f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9c16165-8b11-45c3-94aa-c284d21e271f" (UID: "a9c16165-8b11-45c3-94aa-c284d21e271f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.607020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn95r" event={"ID":"01642193-d926-44c5-908a-98716476032b","Type":"ContainerStarted","Data":"ff94875b24f1eb34cb791a00e03c8e05e0f25c0e0fab94524514adf241bb16d0"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.609453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbrpd" event={"ID":"c03b812d-6833-4c65-887b-0fa0a6c1227a","Type":"ContainerStarted","Data":"b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.611690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ngh4" event={"ID":"b76a4267-3557-4246-b3dc-84a610d9fbd4","Type":"ContainerStarted","Data":"5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.613373 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerID="5d6a6f9b707d6baef2f8a4063cc8975e39135845eb3d6c7b5863d2297d751b2b" exitCode=0 Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.613428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tndcr" event={"ID":"9ec3b90d-1483-485e-ab0a-52af455cc9ea","Type":"ContainerDied","Data":"5d6a6f9b707d6baef2f8a4063cc8975e39135845eb3d6c7b5863d2297d751b2b"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.616781 4749 generic.go:334] "Generic (PLEG): container finished" podID="5239a0f8-12de-4979-af6c-d209a21bc067" 
containerID="4e140e7e41fbcd5b78bfd83fc4f3acf53a4239095e7ace2dc5db60d92c073291" exitCode=0 Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.616841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whzp4" event={"ID":"5239a0f8-12de-4979-af6c-d209a21bc067","Type":"ContainerDied","Data":"4e140e7e41fbcd5b78bfd83fc4f3acf53a4239095e7ace2dc5db60d92c073291"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.619666 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9c16165-8b11-45c3-94aa-c284d21e271f" containerID="a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe" exitCode=0 Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.619734 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.619750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" event={"ID":"a9c16165-8b11-45c3-94aa-c284d21e271f","Type":"ContainerDied","Data":"a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.619959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-756f5b687d-n96jl" event={"ID":"a9c16165-8b11-45c3-94aa-c284d21e271f","Type":"ContainerDied","Data":"ee520f31e33787eb7d21b7e4c6c93405d3dd059da41f407dfb658da92dfda2a4"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.619999 4749 scope.go:117] "RemoveContainer" containerID="a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.624002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhv8w" event={"ID":"ffc6bd30-5803-4a01-a711-4f3b3c718750","Type":"ContainerStarted","Data":"69b5de84c695befd06dc04d0d91fe0134a8a79a90179f81d140205312ecf4fe0"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.640046 4749 generic.go:334] "Generic (PLEG): container finished" podID="58752ef3-6725-4333-ae18-21909c443ff2" containerID="2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df" exitCode=0 Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.640164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" event={"ID":"58752ef3-6725-4333-ae18-21909c443ff2","Type":"ContainerDied","Data":"2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.640199 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.640207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9" event={"ID":"58752ef3-6725-4333-ae18-21909c443ff2","Type":"ContainerDied","Data":"c57ea30a8ceabaef5e3e7f7e0e0975917ffb5cec6df2b21c2c85d3257163e0b4"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.648217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2w9" event={"ID":"ebc65c41-25f8-4ee9-9993-a00101a35397","Type":"ContainerStarted","Data":"b79ff7c2c69acb044f8bf97cfaa44ff922f20495b4d885ab05a85c8d54786f3c"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.652769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25wll" event={"ID":"3dcb2d5f-2613-4277-bba9-89e404c7832a","Type":"ContainerStarted","Data":"a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c"} Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.674149 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c16165-8b11-45c3-94aa-c284d21e271f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.674182 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.674192 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.674204 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt595\" (UniqueName: \"kubernetes.io/projected/a9c16165-8b11-45c3-94aa-c284d21e271f-kube-api-access-gt595\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.674213 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9c16165-8b11-45c3-94aa-c284d21e271f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.949787 4749 scope.go:117] "RemoveContainer" containerID="a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe" Jan 28 18:37:41 crc kubenswrapper[4749]: E0128 18:37:41.950225 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe\": container with ID starting with a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe not found: ID does not exist" containerID="a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.950344 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe"} err="failed to get container status \"a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe\": rpc error: code = NotFound desc = could not find container \"a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe\": container with ID 
starting with a834c51974ccc7f7dfb4f1597e5715fb1edac442be80ddf9242a1c040014b0fe not found: ID does not exist" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.950372 4749 scope.go:117] "RemoveContainer" containerID="2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.961698 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-756f5b687d-n96jl"] Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.964971 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-756f5b687d-n96jl"] Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.967018 4749 scope.go:117] "RemoveContainer" containerID="2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df" Jan 28 18:37:41 crc kubenswrapper[4749]: E0128 18:37:41.967499 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df\": container with ID starting with 2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df not found: ID does not exist" containerID="2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.967557 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df"} err="failed to get container status \"2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df\": rpc error: code = NotFound desc = could not find container \"2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df\": container with ID starting with 2f9c1acb5e43f257e22a0d41b15e6628ee9621f4deabec40840f9711149d36df not found: ID does not exist" Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.972930 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9"] Jan 28 18:37:41 crc kubenswrapper[4749]: I0128 18:37:41.976034 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6885bdfc9f-6z6z9"] Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.098979 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm"] Jan 28 18:37:42 crc kubenswrapper[4749]: E0128 18:37:42.099194 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c16165-8b11-45c3-94aa-c284d21e271f" containerName="controller-manager" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.099213 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c16165-8b11-45c3-94aa-c284d21e271f" containerName="controller-manager" Jan 28 18:37:42 crc kubenswrapper[4749]: E0128 18:37:42.099227 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce" containerName="pruner" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.099233 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce" containerName="pruner" Jan 28 18:37:42 crc kubenswrapper[4749]: E0128 18:37:42.099242 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58752ef3-6725-4333-ae18-21909c443ff2" containerName="route-controller-manager" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.099248 
4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="58752ef3-6725-4333-ae18-21909c443ff2" containerName="route-controller-manager" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.099363 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="58752ef3-6725-4333-ae18-21909c443ff2" containerName="route-controller-manager" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.099374 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e5d80a-6ec2-4e6e-9e9a-aef56e2c85ce" containerName="pruner" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.099383 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c16165-8b11-45c3-94aa-c284d21e271f" containerName="controller-manager" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.099819 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.101537 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc"] Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.102157 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.102861 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.104786 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.104964 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.105087 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.105184 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.105305 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.105745 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.105892 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.106004 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.106618 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc"] Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.109237 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm"] Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.109718 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.109930 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.110171 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.112674 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.284919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-config\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.284981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-config\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.285010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-serving-cert\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.285039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-proxy-ca-bundles\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.285064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-985lj\" (UniqueName: \"kubernetes.io/projected/71a4037e-72f6-49aa-9853-e890e627e4e9-kube-api-access-985lj\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.285099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71a4037e-72f6-49aa-9853-e890e627e4e9-serving-cert\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.285133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-client-ca\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.285156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-client-ca\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.285188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh8gh\" (UniqueName: \"kubernetes.io/projected/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-kube-api-access-rh8gh\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386799 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-config\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386842 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-config\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-serving-cert\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-proxy-ca-bundles\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-985lj\" (UniqueName: \"kubernetes.io/projected/71a4037e-72f6-49aa-9853-e890e627e4e9-kube-api-access-985lj\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71a4037e-72f6-49aa-9853-e890e627e4e9-serving-cert\") pod 
\"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-client-ca\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386959 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-client-ca\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.386979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh8gh\" (UniqueName: \"kubernetes.io/projected/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-kube-api-access-rh8gh\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.388110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-client-ca\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.388469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-client-ca\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.388477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-proxy-ca-bundles\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.388735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-config\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.388869 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-config\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 
18:37:42.391005 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71a4037e-72f6-49aa-9853-e890e627e4e9-serving-cert\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.396415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-serving-cert\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.402034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-985lj\" (UniqueName: \"kubernetes.io/projected/71a4037e-72f6-49aa-9853-e890e627e4e9-kube-api-access-985lj\") pod \"controller-manager-9dcf4c7fd-22dcm\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.402304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh8gh\" (UniqueName: \"kubernetes.io/projected/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-kube-api-access-rh8gh\") pod \"route-controller-manager-5b4dcf45dc-w8llc\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.566813 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.572185 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.663033 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerID="a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c" exitCode=0 Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.663220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25wll" event={"ID":"3dcb2d5f-2613-4277-bba9-89e404c7832a","Type":"ContainerDied","Data":"a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c"} Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.666764 4749 generic.go:334] "Generic (PLEG): container finished" podID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerID="b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593" exitCode=0 Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.666824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbrpd" event={"ID":"c03b812d-6833-4c65-887b-0fa0a6c1227a","Type":"ContainerDied","Data":"b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593"} Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.673251 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whzp4" event={"ID":"5239a0f8-12de-4979-af6c-d209a21bc067","Type":"ContainerStarted","Data":"f1b91af5baceab106fd6e75209fa95e07c1de01ae7622838db2b1c30dbde2520"} Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.675617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tndcr" event={"ID":"9ec3b90d-1483-485e-ab0a-52af455cc9ea","Type":"ContainerStarted","Data":"9d37f94c2dc698e57f6a3495021472c11f87b6465c6bb6af16208d4abc76c63a"} Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.678449 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerID="69b5de84c695befd06dc04d0d91fe0134a8a79a90179f81d140205312ecf4fe0" exitCode=0 Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.678508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhv8w" event={"ID":"ffc6bd30-5803-4a01-a711-4f3b3c718750","Type":"ContainerDied","Data":"69b5de84c695befd06dc04d0d91fe0134a8a79a90179f81d140205312ecf4fe0"} Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.684507 4749 generic.go:334] "Generic (PLEG): container finished" podID="01642193-d926-44c5-908a-98716476032b" containerID="ff94875b24f1eb34cb791a00e03c8e05e0f25c0e0fab94524514adf241bb16d0" exitCode=0 Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.684573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn95r" event={"ID":"01642193-d926-44c5-908a-98716476032b","Type":"ContainerDied","Data":"ff94875b24f1eb34cb791a00e03c8e05e0f25c0e0fab94524514adf241bb16d0"} Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.687472 4749 generic.go:334] "Generic (PLEG): container finished" podID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerID="5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f" exitCode=0 Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.687531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ngh4" 
event={"ID":"b76a4267-3557-4246-b3dc-84a610d9fbd4","Type":"ContainerDied","Data":"5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f"} Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.694610 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerID="b79ff7c2c69acb044f8bf97cfaa44ff922f20495b4d885ab05a85c8d54786f3c" exitCode=0 Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.694643 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2w9" event={"ID":"ebc65c41-25f8-4ee9-9993-a00101a35397","Type":"ContainerDied","Data":"b79ff7c2c69acb044f8bf97cfaa44ff922f20495b4d885ab05a85c8d54786f3c"} Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.705381 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tndcr" podStartSLOduration=3.579043212 podStartE2EDuration="1m18.705363004s" podCreationTimestamp="2026-01-28 18:36:24 +0000 UTC" firstStartedPulling="2026-01-28 18:36:27.002857785 +0000 UTC m=+55.014384560" lastFinishedPulling="2026-01-28 18:37:42.129177577 +0000 UTC m=+130.140704352" observedRunningTime="2026-01-28 18:37:42.702396892 +0000 UTC m=+130.713923687" watchObservedRunningTime="2026-01-28 18:37:42.705363004 +0000 UTC m=+130.716889779" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.742742 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-whzp4" podStartSLOduration=2.642516964 podStartE2EDuration="1m18.742725434s" podCreationTimestamp="2026-01-28 18:36:24 +0000 UTC" firstStartedPulling="2026-01-28 18:36:25.935518509 +0000 UTC m=+53.947045284" lastFinishedPulling="2026-01-28 18:37:42.035726979 +0000 UTC m=+130.047253754" observedRunningTime="2026-01-28 18:37:42.738631683 +0000 UTC m=+130.750158468" watchObservedRunningTime="2026-01-28 18:37:42.742725434 +0000 UTC m=+130.754252209" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.877167 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58752ef3-6725-4333-ae18-21909c443ff2" path="/var/lib/kubelet/pods/58752ef3-6725-4333-ae18-21909c443ff2/volumes" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.877686 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c16165-8b11-45c3-94aa-c284d21e271f" path="/var/lib/kubelet/pods/a9c16165-8b11-45c3-94aa-c284d21e271f/volumes" Jan 28 18:37:42 crc kubenswrapper[4749]: I0128 18:37:42.981377 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm"] Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.042902 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc"] Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.616865 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" podUID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" containerName="oauth-openshift" containerID="cri-o://59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d" gracePeriod=15 Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.708641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" 
event={"ID":"71a4037e-72f6-49aa-9853-e890e627e4e9","Type":"ContainerStarted","Data":"86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8"} Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.709094 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" event={"ID":"71a4037e-72f6-49aa-9853-e890e627e4e9","Type":"ContainerStarted","Data":"632673ba54a2901a03d46d203eded79606676d04ee6b0647c34b2f888ddcca5d"} Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.709111 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.711558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" event={"ID":"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531","Type":"ContainerStarted","Data":"5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919"} Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.711623 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" event={"ID":"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531","Type":"ContainerStarted","Data":"de3dde4a5c8375460bb1e471dff84dce528fa1db2ef9b4d17aa43e0b48d7e70f"} Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.711648 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.714413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhv8w" event={"ID":"ffc6bd30-5803-4a01-a711-4f3b3c718750","Type":"ContainerStarted","Data":"776f8e0882a32def36f1f68153ebfae7c3a09c93279505d95a0a53492c539a9c"} Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.716302 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.721180 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn95r" event={"ID":"01642193-d926-44c5-908a-98716476032b","Type":"ContainerStarted","Data":"bb8374f38945f151cdc5a4371456d1f24dc95f8ddee532dd2a1d88f16b432560"} Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.724143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbrpd" event={"ID":"c03b812d-6833-4c65-887b-0fa0a6c1227a","Type":"ContainerStarted","Data":"50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af"} Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.733786 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" podStartSLOduration=3.733763673 podStartE2EDuration="3.733763673s" podCreationTimestamp="2026-01-28 18:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:37:43.729857917 +0000 UTC m=+131.741384712" watchObservedRunningTime="2026-01-28 18:37:43.733763673 +0000 UTC m=+131.745290448" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.738701 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.762080 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rhv8w" podStartSLOduration=2.152581037 podStartE2EDuration="1m21.762062839s" podCreationTimestamp="2026-01-28 18:36:22 +0000 UTC" firstStartedPulling="2026-01-28 18:36:23.787814548 +0000 UTC m=+51.799341323" lastFinishedPulling="2026-01-28 18:37:43.39729635 +0000 UTC m=+131.408823125" observedRunningTime="2026-01-28 18:37:43.759558687 +0000 UTC m=+131.771085482" watchObservedRunningTime="2026-01-28 18:37:43.762062839 +0000 UTC m=+131.773589604" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.851026 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cn95r" podStartSLOduration=3.333452065 podStartE2EDuration="1m22.851006426s" podCreationTimestamp="2026-01-28 18:36:21 +0000 UTC" firstStartedPulling="2026-01-28 18:36:23.790357641 +0000 UTC m=+51.801884416" lastFinishedPulling="2026-01-28 18:37:43.307912002 +0000 UTC m=+131.319438777" observedRunningTime="2026-01-28 18:37:43.849858018 +0000 UTC m=+131.861384813" watchObservedRunningTime="2026-01-28 18:37:43.851006426 +0000 UTC m=+131.862533201" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.879859 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jbrpd" podStartSLOduration=3.594236275 podStartE2EDuration="1m21.879836004s" podCreationTimestamp="2026-01-28 18:36:22 +0000 UTC" firstStartedPulling="2026-01-28 18:36:24.962650325 +0000 UTC m=+52.974177100" lastFinishedPulling="2026-01-28 18:37:43.248250054 +0000 UTC m=+131.259776829" observedRunningTime="2026-01-28 18:37:43.878518712 +0000 UTC m=+131.890045497" watchObservedRunningTime="2026-01-28 18:37:43.879836004 +0000 UTC m=+131.891362779" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.890079 4749 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r9dfj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.890135 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" podUID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Jan 28 18:37:43 crc kubenswrapper[4749]: I0128 18:37:43.908097 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" podStartSLOduration=3.908073029 podStartE2EDuration="3.908073029s" podCreationTimestamp="2026-01-28 18:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:37:43.9044464 +0000 UTC m=+131.915973175" watchObservedRunningTime="2026-01-28 18:37:43.908073029 +0000 UTC m=+131.919599804" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.170837 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.314973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-login\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8k54\" (UniqueName: \"kubernetes.io/projected/f172bf9b-7eb3-46b1-8a40-9e566b01b433-kube-api-access-s8k54\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-idp-0-file-data\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-session\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-serving-cert\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-dir\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-service-ca\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315286 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-ocp-branding-template\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315312 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-error\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 
18:37:44.315347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-cliconfig\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315415 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-provider-selection\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315439 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-router-certs\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315467 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-trusted-ca-bundle\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.315496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-policies\") pod \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\" (UID: \"f172bf9b-7eb3-46b1-8a40-9e566b01b433\") " Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.316198 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.316489 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.316813 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.317490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.318212 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.322354 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.323059 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.323560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.324683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f172bf9b-7eb3-46b1-8a40-9e566b01b433-kube-api-access-s8k54" (OuterVolumeSpecName: "kube-api-access-s8k54") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "kube-api-access-s8k54". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.324886 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.325471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.330816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.335957 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.336840 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f172bf9b-7eb3-46b1-8a40-9e566b01b433" (UID: "f172bf9b-7eb3-46b1-8a40-9e566b01b433"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.416978 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8k54\" (UniqueName: \"kubernetes.io/projected/f172bf9b-7eb3-46b1-8a40-9e566b01b433-kube-api-access-s8k54\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417242 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417258 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417271 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417285 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417295 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417304 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417314 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417346 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417357 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417367 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417378 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417386 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f172bf9b-7eb3-46b1-8a40-9e566b01b433-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.417395 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f172bf9b-7eb3-46b1-8a40-9e566b01b433-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.453604 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.453716 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.737687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ngh4" event={"ID":"b76a4267-3557-4246-b3dc-84a610d9fbd4","Type":"ContainerStarted","Data":"63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad"} Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.739741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2w9" event={"ID":"ebc65c41-25f8-4ee9-9993-a00101a35397","Type":"ContainerStarted","Data":"5aaae7b195ab94a2bcf3385a48673994535ef87fd799f749e01fb22a8ac0b0b6"} Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.741964 4749 generic.go:334] "Generic (PLEG): container finished" podID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" containerID="59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d" exitCode=0 Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.742007 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.742043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" event={"ID":"f172bf9b-7eb3-46b1-8a40-9e566b01b433","Type":"ContainerDied","Data":"59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d"} Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.742066 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r9dfj" event={"ID":"f172bf9b-7eb3-46b1-8a40-9e566b01b433","Type":"ContainerDied","Data":"379dbeb7011d7954bd267a571e18ec86d792e1a6299518daa4ab838ea4455672"} Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.742081 4749 scope.go:117] "RemoveContainer" containerID="59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.746225 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25wll" event={"ID":"3dcb2d5f-2613-4277-bba9-89e404c7832a","Type":"ContainerStarted","Data":"82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42"} Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.761402 4749 scope.go:117] "RemoveContainer" containerID="59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d" Jan 28 18:37:44 crc kubenswrapper[4749]: E0128 18:37:44.761946 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d\": container with ID starting with 59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d not found: ID does not exist" containerID="59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.761979 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d"} err="failed to get container status \"59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d\": rpc error: code = NotFound desc = could not find container \"59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d\": container with ID starting with 59863a8485651670b2743f5fb017d7d9e8d4da8b3ecf68509f69e422b020b29d not found: ID does not exist" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.767790 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4ngh4" podStartSLOduration=2.9605422839999997 podStartE2EDuration="1m22.767766098s" podCreationTimestamp="2026-01-28 18:36:22 +0000 UTC" firstStartedPulling="2026-01-28 18:36:23.779306479 +0000 UTC m=+51.790833274" lastFinishedPulling="2026-01-28 18:37:43.586530323 +0000 UTC m=+131.598057088" observedRunningTime="2026-01-28 18:37:44.764733114 +0000 UTC m=+132.776259899" watchObservedRunningTime="2026-01-28 18:37:44.767766098 +0000 UTC m=+132.779292883" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.789466 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9dfj"] Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.792906 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9dfj"] Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.813038 
4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-25wll" podStartSLOduration=3.386653621 podStartE2EDuration="1m19.813020262s" podCreationTimestamp="2026-01-28 18:36:25 +0000 UTC" firstStartedPulling="2026-01-28 18:36:27.018587561 +0000 UTC m=+55.030114336" lastFinishedPulling="2026-01-28 18:37:43.444954202 +0000 UTC m=+131.456480977" observedRunningTime="2026-01-28 18:37:44.812705644 +0000 UTC m=+132.824232429" watchObservedRunningTime="2026-01-28 18:37:44.813020262 +0000 UTC m=+132.824547037" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.833587 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mk2w9" podStartSLOduration=3.303716262 podStartE2EDuration="1m19.833558606s" podCreationTimestamp="2026-01-28 18:36:25 +0000 UTC" firstStartedPulling="2026-01-28 18:36:26.997296808 +0000 UTC m=+55.008823583" lastFinishedPulling="2026-01-28 18:37:43.527139152 +0000 UTC m=+131.538665927" observedRunningTime="2026-01-28 18:37:44.830881051 +0000 UTC m=+132.842407836" watchObservedRunningTime="2026-01-28 18:37:44.833558606 +0000 UTC m=+132.845085381" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.879864 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" path="/var/lib/kubelet/pods/f172bf9b-7eb3-46b1-8a40-9e566b01b433/volumes" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.880366 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:37:44 crc kubenswrapper[4749]: I0128 18:37:44.880400 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:37:45 crc kubenswrapper[4749]: I0128 18:37:45.558701 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:37:45 crc kubenswrapper[4749]: I0128 18:37:45.559806 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:37:45 crc kubenswrapper[4749]: I0128 18:37:45.769262 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-whzp4" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="registry-server" probeResult="failure" output=< Jan 28 18:37:45 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:37:45 crc kubenswrapper[4749]: > Jan 28 18:37:45 crc kubenswrapper[4749]: I0128 18:37:45.916223 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-tndcr" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerName="registry-server" probeResult="failure" output=< Jan 28 18:37:45 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:37:45 crc kubenswrapper[4749]: > Jan 28 18:37:45 crc kubenswrapper[4749]: I0128 18:37:45.917523 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:37:45 crc kubenswrapper[4749]: I0128 18:37:45.917571 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:37:46 crc kubenswrapper[4749]: I0128 18:37:46.601507 4749 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-mk2w9" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="registry-server" probeResult="failure" output=< Jan 28 18:37:46 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:37:46 crc kubenswrapper[4749]: > Jan 28 18:37:46 crc kubenswrapper[4749]: I0128 18:37:46.953030 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-25wll" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="registry-server" probeResult="failure" output=< Jan 28 18:37:46 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:37:46 crc kubenswrapper[4749]: > Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.109210 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9645b9d-pv2l5"] Jan 28 18:37:48 crc kubenswrapper[4749]: E0128 18:37:48.110258 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" containerName="oauth-openshift" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.110315 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" containerName="oauth-openshift" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.112788 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f172bf9b-7eb3-46b1-8a40-9e566b01b433" containerName="oauth-openshift" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.113682 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.118083 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.118235 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.118369 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.118546 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.119579 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.119626 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.122364 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.122372 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.122459 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.122396 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-session" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.122828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9645b9d-pv2l5"] Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.123551 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.126881 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.128907 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.137733 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.140577 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.269767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-router-certs\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.269819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.269848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.269872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-error\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.269889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-login\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: 
I0128 18:37:48.270001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-service-ca\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.270090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b86ef46c-86cf-472f-bf58-dc0545d1ae16-audit-dir\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.270120 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-audit-policies\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.270136 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.270225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.270279 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.270347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.270384 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52j9f\" (UniqueName: \"kubernetes.io/projected/b86ef46c-86cf-472f-bf58-dc0545d1ae16-kube-api-access-52j9f\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 
18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.270411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-session\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372123 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52j9f\" (UniqueName: \"kubernetes.io/projected/b86ef46c-86cf-472f-bf58-dc0545d1ae16-kube-api-access-52j9f\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-session\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-router-certs\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372217 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-error\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc 
kubenswrapper[4749]: I0128 18:37:48.372254 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-login\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-service-ca\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372297 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b86ef46c-86cf-472f-bf58-dc0545d1ae16-audit-dir\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-audit-policies\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.372477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b86ef46c-86cf-472f-bf58-dc0545d1ae16-audit-dir\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.373038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-service-ca\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.373456 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-audit-policies\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.373482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.375623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.377185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-router-certs\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.377807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.379382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.381882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-error\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.382929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.383885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.384803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-user-template-login\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.387177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52j9f\" (UniqueName: \"kubernetes.io/projected/b86ef46c-86cf-472f-bf58-dc0545d1ae16-kube-api-access-52j9f\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.387490 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b86ef46c-86cf-472f-bf58-dc0545d1ae16-v4-0-config-system-session\") pod \"oauth-openshift-9645b9d-pv2l5\" (UID: \"b86ef46c-86cf-472f-bf58-dc0545d1ae16\") " pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.433380 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:48 crc kubenswrapper[4749]: I0128 18:37:48.830941 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9645b9d-pv2l5"] Jan 28 18:37:48 crc kubenswrapper[4749]: W0128 18:37:48.838117 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86ef46c_86cf_472f_bf58_dc0545d1ae16.slice/crio-0651c29d40b4adac189fe0cf8fafc95d1042b46387f3a393fc8572ddcfdceaac WatchSource:0}: Error finding container 0651c29d40b4adac189fe0cf8fafc95d1042b46387f3a393fc8572ddcfdceaac: Status 404 returned error can't find the container with id 0651c29d40b4adac189fe0cf8fafc95d1042b46387f3a393fc8572ddcfdceaac Jan 28 18:37:49 crc kubenswrapper[4749]: I0128 18:37:49.772612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" event={"ID":"b86ef46c-86cf-472f-bf58-dc0545d1ae16","Type":"ContainerStarted","Data":"0651c29d40b4adac189fe0cf8fafc95d1042b46387f3a393fc8572ddcfdceaac"} Jan 28 18:37:50 crc kubenswrapper[4749]: I0128 18:37:50.784242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" event={"ID":"b86ef46c-86cf-472f-bf58-dc0545d1ae16","Type":"ContainerStarted","Data":"dd74bc7b220b0f7daf63cee888d2d3a0b79502106ccf6457d3d7b46240efc6b8"} Jan 28 18:37:50 crc kubenswrapper[4749]: I0128 18:37:50.784822 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:50 crc kubenswrapper[4749]: I0128 18:37:50.793181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" Jan 28 18:37:50 crc kubenswrapper[4749]: I0128 18:37:50.813476 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9645b9d-pv2l5" podStartSLOduration=32.813459385 podStartE2EDuration="32.813459385s" podCreationTimestamp="2026-01-28 18:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:37:50.810531175 +0000 UTC m=+138.822058000" watchObservedRunningTime="2026-01-28 18:37:50.813459385 +0000 UTC m=+138.824986160" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.081855 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.082752 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.165961 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.468425 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.469123 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.469207 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.469421 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71" gracePeriod=15 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.469478 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569" gracePeriod=15 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.469526 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16" gracePeriod=15 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.469562 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092" gracePeriod=15 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.469525 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c" gracePeriod=15 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.485503 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 18:37:52 crc kubenswrapper[4749]: E0128 18:37:52.485783 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.485799 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 28 18:37:52 crc kubenswrapper[4749]: E0128 18:37:52.485948 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.485962 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 18:37:52 crc kubenswrapper[4749]: E0128 18:37:52.485974 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.485982 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 18:37:52 crc kubenswrapper[4749]: E0128 18:37:52.485993 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486000 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 28 18:37:52 crc kubenswrapper[4749]: E0128 18:37:52.486011 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486018 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 18:37:52 crc kubenswrapper[4749]: E0128 18:37:52.486028 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486036 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 18:37:52 crc kubenswrapper[4749]: E0128 18:37:52.486046 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486053 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486172 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486187 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486196 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486205 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486215 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.486447 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.538589 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.538640 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.592072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.592918 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.593291 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.650693 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.650767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.650791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.650808 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.650872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.650994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.651063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.651113 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.752669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.752990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753077 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.752825 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753220 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753381 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.753899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.795437 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e45df95-120e-4171-8e11-01988340d6aa" containerID="dfdded196cca90d6ff8b284a123c5f701acd6e6e04ec96621f5fb1f55a9ee2d2" exitCode=0 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.795514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7e45df95-120e-4171-8e11-01988340d6aa","Type":"ContainerDied","Data":"dfdded196cca90d6ff8b284a123c5f701acd6e6e04ec96621f5fb1f55a9ee2d2"} Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.796060 4749 status_manager.go:851] "Failed to get status for pod" 
podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.796355 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.796843 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.797595 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.798733 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.799433 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569" exitCode=0 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.799462 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c" exitCode=0 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.799471 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16" exitCode=0 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.799480 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092" exitCode=2 Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.800364 4749 scope.go:117] "RemoveContainer" containerID="16cfb9272919aa566921d18cf2131a600913740c6ebeadb64266dca13633d136" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.830411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.830454 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.836124 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.836728 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.837135 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.837213 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.837620 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.838013 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.838452 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.838796 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.839074 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.839285 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.868965 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.869632 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.870036 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.870274 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.870464 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.870965 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.874219 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.874607 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.874974 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.875231 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.875541 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.918823 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.918886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.952562 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.953069 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.953357 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.953592 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.953918 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:52 crc kubenswrapper[4749]: I0128 18:37:52.954159 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.806598 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.843595 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.844052 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc 
kubenswrapper[4749]: I0128 18:37:53.844105 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.844341 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.844718 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.844962 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.845207 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.845472 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.845684 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.845853 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.846288 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:53 crc kubenswrapper[4749]: I0128 18:37:53.846550 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.131148 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.131699 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.132089 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.132353 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.132582 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.132863 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: E0128 18:37:54.261417 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:37:54Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:37:54Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:37:54Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:37:54Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2c1439ebdda893daf377def2d4397762658d82b531bb83f7ae41a4e7f26d4407\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:c044fa5dc076cb0fb053c5a676c39093e5fd06f6cc0eeaff8a747680c99c8b7f\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1675724519},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:40a0af9b58137c413272f3533763f7affd5db97e6ef410a6aeabce6d81a246ee\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7e9b6f6bdbfa69f6106bc85eaee51d908ede4be851b578362af443af6bf732a8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1202031349},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:364f5956de22b63db7dad4fcdd1f2740f71a482026c15aa3e2abebfbc5bf2fd7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d3d262f90dd0f3c3f809b45f327ca086741a47f73e44560b04787609f0f99567\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1187310829},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:420326d8488ceff2cde22ad8b85d739b0c254d47e703f7ddb1f08f77a48816a6\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:54817da328fa589491a3acbe80acdd88c0830dcc63aaafc08c3539925a1a3b03\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1180692192},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\
\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: E0128 18:37:54.261685 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: E0128 18:37:54.261846 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: E0128 18:37:54.261999 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: E0128 18:37:54.262152 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: E0128 18:37:54.262169 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.271946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-kubelet-dir\") pod \"7e45df95-120e-4171-8e11-01988340d6aa\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.272002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e45df95-120e-4171-8e11-01988340d6aa-kube-api-access\") pod \"7e45df95-120e-4171-8e11-01988340d6aa\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.272027 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-var-lock\") pod \"7e45df95-120e-4171-8e11-01988340d6aa\" (UID: \"7e45df95-120e-4171-8e11-01988340d6aa\") " Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.272060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7e45df95-120e-4171-8e11-01988340d6aa" (UID: "7e45df95-120e-4171-8e11-01988340d6aa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.272178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-var-lock" (OuterVolumeSpecName: "var-lock") pod "7e45df95-120e-4171-8e11-01988340d6aa" (UID: "7e45df95-120e-4171-8e11-01988340d6aa"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.272296 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.272320 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e45df95-120e-4171-8e11-01988340d6aa-var-lock\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.277113 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e45df95-120e-4171-8e11-01988340d6aa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7e45df95-120e-4171-8e11-01988340d6aa" (UID: "7e45df95-120e-4171-8e11-01988340d6aa"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.373433 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e45df95-120e-4171-8e11-01988340d6aa-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.490382 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.491248 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.491773 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.492188 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.492583 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.492919 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.493208 4749 status_manager.go:851] "Failed to get 
status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.523921 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.524553 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.524978 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.525238 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.525626 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.525868 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.526128 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.814755 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.815875 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71" exitCode=0 Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.815946 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e65dc445d223305d80984fc025b131c9bf442556d029d51239f6008707397d6" 
Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.818045 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.820809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7e45df95-120e-4171-8e11-01988340d6aa","Type":"ContainerDied","Data":"f6aada7bc62fd480d1f750be1f0b6abc3bff587dbb7943ec7a94337b177da0bd"} Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.820860 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6aada7bc62fd480d1f750be1f0b6abc3bff587dbb7943ec7a94337b177da0bd" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.845304 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.845660 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.846006 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.846167 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.846433 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.846733 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.848261 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.849019 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.849404 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.849606 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.849802 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.850040 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.850266 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.850699 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.851510 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.907427 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.908011 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.908454 4749 status_manager.go:851] "Failed to get status for pod" 
podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.908747 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.909108 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.909384 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.909644 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.909888 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.940020 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.940613 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.941033 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.941287 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial 
tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.941526 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.941730 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.942035 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.942498 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.979966 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.980068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.980060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.980086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.980122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.980149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.980634 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.980655 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:54 crc kubenswrapper[4749]: I0128 18:37:54.981073 4749 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.596184 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.596793 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.597199 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.597537 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.598002 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.598648 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.598952 4749 status_manager.go:851] "Failed to get status for pod" 
podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.599311 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.599716 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.634859 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.635516 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.635759 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.635972 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.636269 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.636641 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.636865 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 
38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.637146 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.637629 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.822711 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.823416 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.823724 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.824134 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.824424 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.824679 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.825355 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.825621 4749 status_manager.go:851] "Failed to get status for pod" 
podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.825864 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.826098 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.838026 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.838231 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.838447 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.838666 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.838857 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.839037 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.839252 4749 status_manager.go:851] "Failed to get status for pod" 
podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.839469 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.839629 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.955023 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.955650 4749 status_manager.go:851] "Failed to get status for pod" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" pod="openshift-marketplace/redhat-operators-25wll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-25wll\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.956437 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.956877 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.957385 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.957635 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.957898 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.958196 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.958479 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.958647 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.958786 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.992309 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.992909 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.993293 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.993617 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.993993 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.994239 4749 
status_manager.go:851] "Failed to get status for pod" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" pod="openshift-marketplace/redhat-operators-25wll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-25wll\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.994825 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.995044 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.995300 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.995626 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:55 crc kubenswrapper[4749]: I0128 18:37:55.995926 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:56 crc kubenswrapper[4749]: I0128 18:37:56.877586 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 28 18:37:57 crc kubenswrapper[4749]: I0128 18:37:57.467542 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:37:57 crc kubenswrapper[4749]: I0128 18:37:57.468111 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:37:57 crc kubenswrapper[4749]: E0128 18:37:57.468895 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": dial tcp 
38.102.83.50:6443: connect: connection refused" event=< Jan 28 18:37:57 crc kubenswrapper[4749]: &Event{ObjectMeta:{machine-config-daemon-698zt.188ef8fef1f74907 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-698zt,UID:1841c82d-7cd1-4c14-b54d-794bbb647776,APIVersion:v1,ResourceVersion:26556,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Jan 28 18:37:57 crc kubenswrapper[4749]: body: Jan 28 18:37:57 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 18:37:57.468080391 +0000 UTC m=+145.479607186,LastTimestamp:2026-01-28 18:37:57.468080391 +0000 UTC m=+145.479607186,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Jan 28 18:37:57 crc kubenswrapper[4749]: > Jan 28 18:37:57 crc kubenswrapper[4749]: E0128 18:37:57.498818 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:57 crc kubenswrapper[4749]: I0128 18:37:57.499264 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:57 crc kubenswrapper[4749]: I0128 18:37:57.834027 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e3b038d6b95eee6c56a120bfad5c72ebfc2b86bbe74dcb4ed896f1e45af58f90"} Jan 28 18:37:58 crc kubenswrapper[4749]: I0128 18:37:58.841459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68"} Jan 28 18:37:59 crc kubenswrapper[4749]: E0128 18:37:59.358525 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event=< Jan 28 18:37:59 crc kubenswrapper[4749]: &Event{ObjectMeta:{machine-config-daemon-698zt.188ef8fef1f74907 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-698zt,UID:1841c82d-7cd1-4c14-b54d-794bbb647776,APIVersion:v1,ResourceVersion:26556,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Jan 28 18:37:59 crc kubenswrapper[4749]: body: Jan 28 18:37:59 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 18:37:57.468080391 +0000 UTC m=+145.479607186,LastTimestamp:2026-01-28 18:37:57.468080391 +0000 UTC m=+145.479607186,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Jan 28 18:37:59 crc kubenswrapper[4749]: > Jan 28 18:37:59 crc kubenswrapper[4749]: E0128 18:37:59.848269 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.849671 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.850393 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.850707 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.851176 4749 status_manager.go:851] "Failed to get status for pod" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" pod="openshift-marketplace/redhat-operators-25wll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-25wll\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.851801 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.852379 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.852905 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.853486 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:37:59 crc kubenswrapper[4749]: I0128 18:37:59.853934 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:01 crc kubenswrapper[4749]: E0128 18:38:01.100889 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:01 crc kubenswrapper[4749]: E0128 18:38:01.101984 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:01 crc kubenswrapper[4749]: E0128 18:38:01.102412 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:01 crc kubenswrapper[4749]: E0128 18:38:01.102816 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:01 crc kubenswrapper[4749]: E0128 18:38:01.103120 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:01 crc kubenswrapper[4749]: I0128 18:38:01.103151 4749 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 28 18:38:01 crc kubenswrapper[4749]: E0128 18:38:01.103405 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="200ms" Jan 28 18:38:01 crc kubenswrapper[4749]: E0128 18:38:01.304747 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="400ms" Jan 28 18:38:01 crc kubenswrapper[4749]: E0128 18:38:01.706319 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="800ms" Jan 28 18:38:02 crc kubenswrapper[4749]: E0128 18:38:02.507659 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="1.6s" Jan 
28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.873172 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.873620 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.873922 4749 status_manager.go:851] "Failed to get status for pod" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" pod="openshift-marketplace/redhat-operators-25wll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-25wll\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.874120 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.874450 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.874773 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.876268 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.876501 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:02 crc kubenswrapper[4749]: I0128 18:38:02.876771 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:04 
crc kubenswrapper[4749]: E0128 18:38:04.109501 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="3.2s" Jan 28 18:38:04 crc kubenswrapper[4749]: E0128 18:38:04.455781 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:38:04Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:38:04Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:38:04Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T18:38:04Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2c1439ebdda893daf377def2d4397762658d82b531bb83f7ae41a4e7f26d4407\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:c044fa5dc076cb0fb053c5a676c39093e5fd06f6cc0eeaff8a747680c99c8b7f\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1675724519},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:40a0af9b58137c413272f3533763f7affd5db97e6ef410a6aeabce6d81a246ee\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7e9b6f6bdbfa69f6106bc85eaee51d908ede4be851b578362af443af6bf732a8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1202031349},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:364f5956de22b63db7dad4fcdd1f2740f71a482026c15aa3e2abebfbc5bf2fd7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d3d262f90dd0f3c3f809b45f327ca086741a47f73e44560b04787609f0f99567\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1187310829},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:420326d8488ceff2cde22ad8b85d739b0c254d47e703f7ddb1f08f77a48816a6\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:54817da328fa589491a3acbe80acdd88c0830dcc63aaafc08c3539925a1a3b03\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1180692192},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb2
36880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:04 crc kubenswrapper[4749]: E0128 18:38:04.456278 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:04 crc kubenswrapper[4749]: E0128 18:38:04.456624 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:04 crc kubenswrapper[4749]: E0128 18:38:04.456868 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:04 crc kubenswrapper[4749]: E0128 18:38:04.457094 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:04 crc kubenswrapper[4749]: E0128 18:38:04.457116 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.871029 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.874393 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.874883 4749 status_manager.go:851] "Failed to get status for pod" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" pod="openshift-marketplace/redhat-operators-25wll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-25wll\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.875289 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.875937 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.876103 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" 
pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.876252 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.876460 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.876612 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.876767 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.882185 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.882237 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6dcb8a332b44be3152800d1d89f6a56f6d0851de934ab0085a81a1eaa2afc002" exitCode=1 Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.882269 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6dcb8a332b44be3152800d1d89f6a56f6d0851de934ab0085a81a1eaa2afc002"} Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.882784 4749 scope.go:117] "RemoveContainer" containerID="6dcb8a332b44be3152800d1d89f6a56f6d0851de934ab0085a81a1eaa2afc002" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.883733 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.883989 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.884196 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.884357 4749 status_manager.go:851] "Failed to get status for pod" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" pod="openshift-marketplace/redhat-operators-25wll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-25wll\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.884547 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.884773 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.885031 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.885311 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.885569 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.885771 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.887542 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 
18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.887580 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:05 crc kubenswrapper[4749]: E0128 18:38:05.888023 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:05 crc kubenswrapper[4749]: I0128 18:38:05.888637 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.417988 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.889400 4749 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d06c3930b245f76a5887c5a0fb7f6223fbe996332509adb0ff7b40986b577289" exitCode=0 Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.889456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d06c3930b245f76a5887c5a0fb7f6223fbe996332509adb0ff7b40986b577289"} Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.889534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"acf1314d1588e2d4eaeb6a6b39929dab37512bec7285baa247e208083d5e6682"} Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.889876 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.889896 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.890498 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: E0128 18:38:06.890561 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.891055 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.891511 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.891711 4749 status_manager.go:851] "Failed to get status for pod" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" pod="openshift-marketplace/redhat-operators-25wll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-25wll\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.891904 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.892159 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.892565 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.892863 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.893116 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.893166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dab4b77a02b0072c9cb635683e220a1f1c9c16c135674b3afa4fe51f046f7325"} Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.893165 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.893447 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 
38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.893737 4749 status_manager.go:851] "Failed to get status for pod" podUID="01642193-d926-44c5-908a-98716476032b" pod="openshift-marketplace/certified-operators-cn95r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cn95r\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.893991 4749 status_manager.go:851] "Failed to get status for pod" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" pod="openshift-marketplace/community-operators-jbrpd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jbrpd\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.894279 4749 status_manager.go:851] "Failed to get status for pod" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" pod="openshift-marketplace/redhat-marketplace-whzp4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-whzp4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.894928 4749 status_manager.go:851] "Failed to get status for pod" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" pod="openshift-marketplace/community-operators-rhv8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rhv8w\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.895185 4749 status_manager.go:851] "Failed to get status for pod" podUID="7e45df95-120e-4171-8e11-01988340d6aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.895451 4749 status_manager.go:851] "Failed to get status for pod" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" pod="openshift-marketplace/redhat-operators-25wll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-25wll\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.895709 4749 status_manager.go:851] "Failed to get status for pod" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" pod="openshift-marketplace/redhat-operators-mk2w9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mk2w9\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.895944 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.896193 4749 status_manager.go:851] "Failed to get status for pod" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" pod="openshift-marketplace/certified-operators-4ngh4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4ngh4\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:06 crc kubenswrapper[4749]: I0128 18:38:06.896976 4749 status_manager.go:851] "Failed to get status for pod" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" pod="openshift-marketplace/redhat-marketplace-tndcr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-tndcr\": dial tcp 38.102.83.50:6443: connect: connection refused" Jan 28 18:38:07 crc kubenswrapper[4749]: I0128 18:38:07.905810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d13943e10196dd5f6c64da1469e404b0f99f69db8ff2cd2c6079335c37143592"} Jan 28 18:38:07 crc kubenswrapper[4749]: I0128 18:38:07.907114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"20d9519e9fc4bdf722e4a10a8375b5c970995c3e3c7c07e59f64340c2087a73b"} Jan 28 18:38:07 crc kubenswrapper[4749]: I0128 18:38:07.907373 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"def10eaf279f29616a423e2b0b908e9bc6280d2c941fc0fe7c1330561ad382a0"} Jan 28 18:38:07 crc kubenswrapper[4749]: I0128 18:38:07.907425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f55a05993806a65610f55c751b1ec5130f4dd9d7e1c598ab0c21381c648408bc"} Jan 28 18:38:08 crc kubenswrapper[4749]: I0128 18:38:08.917727 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"25b8e36a853ae857561571da21266392f21d2cabf8a833115e3cea20f2705109"} Jan 28 18:38:08 crc kubenswrapper[4749]: I0128 18:38:08.917910 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:08 crc kubenswrapper[4749]: I0128 18:38:08.918051 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:08 crc kubenswrapper[4749]: I0128 18:38:08.918075 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:10 crc kubenswrapper[4749]: I0128 18:38:10.889366 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:10 crc kubenswrapper[4749]: I0128 18:38:10.889693 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:10 crc kubenswrapper[4749]: I0128 18:38:10.893899 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:13 crc kubenswrapper[4749]: I0128 18:38:13.927560 4749 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:14 crc kubenswrapper[4749]: I0128 18:38:14.113449 4749 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="00b0f299-a889-4f46-aa97-9648e14198b7" Jan 28 18:38:14 crc kubenswrapper[4749]: I0128 18:38:14.367937 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:38:14 crc kubenswrapper[4749]: I0128 18:38:14.948747 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:14 crc kubenswrapper[4749]: I0128 18:38:14.949624 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:14 crc kubenswrapper[4749]: I0128 18:38:14.952123 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="00b0f299-a889-4f46-aa97-9648e14198b7" Jan 28 18:38:14 crc kubenswrapper[4749]: I0128 18:38:14.952304 4749 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://f55a05993806a65610f55c751b1ec5130f4dd9d7e1c598ab0c21381c648408bc" Jan 28 18:38:14 crc kubenswrapper[4749]: I0128 18:38:14.952353 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:15 crc kubenswrapper[4749]: I0128 18:38:15.954963 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:15 crc kubenswrapper[4749]: I0128 18:38:15.955003 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4d46ae1a-7005-4413-8fd3-5c7ef768cefd" Jan 28 18:38:15 crc kubenswrapper[4749]: I0128 18:38:15.957363 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="00b0f299-a889-4f46-aa97-9648e14198b7" Jan 28 18:38:16 crc kubenswrapper[4749]: I0128 18:38:16.418869 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:38:16 crc kubenswrapper[4749]: I0128 18:38:16.422872 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:38:16 crc kubenswrapper[4749]: I0128 18:38:16.965554 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 18:38:23 crc kubenswrapper[4749]: I0128 18:38:23.078004 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 28 18:38:23 crc kubenswrapper[4749]: I0128 18:38:23.820268 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 28 18:38:24 crc kubenswrapper[4749]: I0128 18:38:24.215962 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 28 18:38:24 crc 
kubenswrapper[4749]: I0128 18:38:24.227439 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.167809 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.220532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.340188 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.485243 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.595660 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.737797 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.815089 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.815131 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.834658 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.846627 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 28 18:38:25 crc kubenswrapper[4749]: I0128 18:38:25.873742 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.101890 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.115970 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.276909 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.392234 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.410484 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.454215 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.584966 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 28 18:38:26 
crc kubenswrapper[4749]: I0128 18:38:26.643866 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.836899 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.846642 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.851908 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 28 18:38:26 crc kubenswrapper[4749]: I0128 18:38:26.946869 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.150724 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.163384 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.263099 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.321287 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.353437 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.467520 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.467585 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.599564 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.713617 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.759262 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.872022 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 28 18:38:27 crc kubenswrapper[4749]: I0128 18:38:27.927774 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 
18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.038936 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.039059 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.041534 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.052356 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.071127 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.235452 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.318844 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.357833 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.380420 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.395117 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.462244 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.685531 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.692487 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.786532 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.802267 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.823718 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.884408 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.914749 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 18:38:28.924993 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 28 18:38:28 crc kubenswrapper[4749]: I0128 
18:38:28.988293 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.107573 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.219784 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.296754 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.336246 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.349695 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.414642 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.437725 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.451112 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.499922 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.535598 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.537462 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.539863 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.579976 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.645893 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.696942 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.755274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.855572 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.978189 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 28 18:38:29 crc kubenswrapper[4749]: I0128 18:38:29.995254 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.088117 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.107568 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.190598 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.349620 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.353817 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.425985 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.567897 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.588668 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.644228 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.665274 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.744804 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.810216 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.871729 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 28 18:38:30 crc kubenswrapper[4749]: I0128 18:38:30.922388 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.079859 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.118927 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.126291 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.169067 
4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.238943 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.241037 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.247074 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.254831 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.269629 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.291445 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.297901 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.371197 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.427602 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.460832 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.463424 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.468028 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.491522 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.610891 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.618956 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.648522 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.724959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.743152 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.750697 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.869137 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.921847 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.970466 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.974786 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.981451 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 28 18:38:31 crc kubenswrapper[4749]: I0128 18:38:31.989037 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.089025 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.148018 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.186865 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.187444 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.205065 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.222911 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.265588 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.291870 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.421498 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.445442 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.445728 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.470624 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.615955 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 28 18:38:32 crc kubenswrapper[4749]: I0128 18:38:32.920133 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.011871 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.015191 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.082049 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.098754 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.152429 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.238768 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.240464 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.292145 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.328494 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.413479 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.479280 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.482487 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.504990 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.547829 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.637192 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.637508 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.644560 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 28 
18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.684121 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.763930 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.836490 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.870482 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 28 18:38:33 crc kubenswrapper[4749]: I0128 18:38:33.944523 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.018214 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.192472 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.420936 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.442392 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.535442 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.554932 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.678798 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.726038 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.778466 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.784506 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.790713 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.798380 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.857368 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.878314 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 28 18:38:34 crc 
kubenswrapper[4749]: I0128 18:38:34.892292 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.899958 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 28 18:38:34 crc kubenswrapper[4749]: I0128 18:38:34.982519 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.033119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.052775 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.076387 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.134059 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.157245 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.209094 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.244031 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.377292 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.432865 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.450255 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.453166 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.522471 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.602064 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.639771 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.654879 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.669465 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 28 
18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.720721 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.862168 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.865131 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 28 18:38:35 crc kubenswrapper[4749]: I0128 18:38:35.944585 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.016457 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.034397 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.060749 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.084719 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.102040 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.186210 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.315454 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.318946 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.363204 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.376226 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.391564 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.409272 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.419539 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.511545 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.660457 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 28 18:38:36 crc 
kubenswrapper[4749]: I0128 18:38:36.663605 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.663661 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.672533 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.707750 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.750564 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.811243 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.869400 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.895911 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 28 18:38:36 crc kubenswrapper[4749]: I0128 18:38:36.949802 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.019450 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.052944 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.098978 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.202061 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.273385 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.348786 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.378403 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.611470 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.703019 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.740570 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.878586 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.980249 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 28 18:38:37 crc kubenswrapper[4749]: I0128 18:38:37.980406 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.136510 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.157864 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.221677 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.319858 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.345125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.358545 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.362734 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.362789 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.366380 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.382933 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.382913642 podStartE2EDuration="25.382913642s" podCreationTimestamp="2026-01-28 18:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:38:38.377966374 +0000 UTC m=+186.389493159" watchObservedRunningTime="2026-01-28 18:38:38.382913642 +0000 UTC m=+186.394440427" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.482162 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.624512 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.868099 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.872062 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 28 
18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.877429 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.890145 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.906687 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.923350 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 18:38:38 crc kubenswrapper[4749]: I0128 18:38:38.968588 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 28 18:38:39 crc kubenswrapper[4749]: I0128 18:38:39.055452 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 28 18:38:39 crc kubenswrapper[4749]: I0128 18:38:39.140441 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 28 18:38:39 crc kubenswrapper[4749]: I0128 18:38:39.261796 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 28 18:38:39 crc kubenswrapper[4749]: I0128 18:38:39.387673 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 28 18:38:39 crc kubenswrapper[4749]: I0128 18:38:39.642125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 28 18:38:39 crc kubenswrapper[4749]: I0128 18:38:39.913234 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.498411 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm"] Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.498672 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" podUID="71a4037e-72f6-49aa-9853-e890e627e4e9" containerName="controller-manager" containerID="cri-o://86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8" gracePeriod=30 Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.504016 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc"] Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.504275 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" podUID="bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" containerName="route-controller-manager" containerID="cri-o://5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919" gracePeriod=30 Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.915067 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.923699 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.981812 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-client-ca\") pod \"71a4037e-72f6-49aa-9853-e890e627e4e9\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.982736 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71a4037e-72f6-49aa-9853-e890e627e4e9-serving-cert\") pod \"71a4037e-72f6-49aa-9853-e890e627e4e9\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.982894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "71a4037e-72f6-49aa-9853-e890e627e4e9" (UID: "71a4037e-72f6-49aa-9853-e890e627e4e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.983082 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-985lj\" (UniqueName: \"kubernetes.io/projected/71a4037e-72f6-49aa-9853-e890e627e4e9-kube-api-access-985lj\") pod \"71a4037e-72f6-49aa-9853-e890e627e4e9\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.983225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-config\") pod \"71a4037e-72f6-49aa-9853-e890e627e4e9\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.983385 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-proxy-ca-bundles\") pod \"71a4037e-72f6-49aa-9853-e890e627e4e9\" (UID: \"71a4037e-72f6-49aa-9853-e890e627e4e9\") " Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.983884 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-config" (OuterVolumeSpecName: "config") pod "71a4037e-72f6-49aa-9853-e890e627e4e9" (UID: "71a4037e-72f6-49aa-9853-e890e627e4e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.983926 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "71a4037e-72f6-49aa-9853-e890e627e4e9" (UID: "71a4037e-72f6-49aa-9853-e890e627e4e9"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.984476 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.984511 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.984527 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71a4037e-72f6-49aa-9853-e890e627e4e9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.989793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a4037e-72f6-49aa-9853-e890e627e4e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "71a4037e-72f6-49aa-9853-e890e627e4e9" (UID: "71a4037e-72f6-49aa-9853-e890e627e4e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:38:40 crc kubenswrapper[4749]: I0128 18:38:40.989989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a4037e-72f6-49aa-9853-e890e627e4e9-kube-api-access-985lj" (OuterVolumeSpecName: "kube-api-access-985lj") pod "71a4037e-72f6-49aa-9853-e890e627e4e9" (UID: "71a4037e-72f6-49aa-9853-e890e627e4e9"). InnerVolumeSpecName "kube-api-access-985lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.045977 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.085589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh8gh\" (UniqueName: \"kubernetes.io/projected/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-kube-api-access-rh8gh\") pod \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.085679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-config\") pod \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.085745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-client-ca\") pod \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.085780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-serving-cert\") pod \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\" (UID: \"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531\") " Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.086047 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-985lj\" (UniqueName: 
\"kubernetes.io/projected/71a4037e-72f6-49aa-9853-e890e627e4e9-kube-api-access-985lj\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.086064 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71a4037e-72f6-49aa-9853-e890e627e4e9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.087213 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" (UID: "bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.087377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-config" (OuterVolumeSpecName: "config") pod "bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" (UID: "bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.089150 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-kube-api-access-rh8gh" (OuterVolumeSpecName: "kube-api-access-rh8gh") pod "bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" (UID: "bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531"). InnerVolumeSpecName "kube-api-access-rh8gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.089226 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" (UID: "bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.108243 4749 generic.go:334] "Generic (PLEG): container finished" podID="bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" containerID="5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919" exitCode=0 Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.108379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" event={"ID":"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531","Type":"ContainerDied","Data":"5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919"} Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.108486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" event={"ID":"bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531","Type":"ContainerDied","Data":"de3dde4a5c8375460bb1e471dff84dce528fa1db2ef9b4d17aa43e0b48d7e70f"} Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.108552 4749 scope.go:117] "RemoveContainer" containerID="5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.108799 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.110947 4749 generic.go:334] "Generic (PLEG): container finished" podID="71a4037e-72f6-49aa-9853-e890e627e4e9" containerID="86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8" exitCode=0 Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.111024 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" event={"ID":"71a4037e-72f6-49aa-9853-e890e627e4e9","Type":"ContainerDied","Data":"86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8"} Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.111078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" event={"ID":"71a4037e-72f6-49aa-9853-e890e627e4e9","Type":"ContainerDied","Data":"632673ba54a2901a03d46d203eded79606676d04ee6b0647c34b2f888ddcca5d"} Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.111214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.134677 4749 scope.go:117] "RemoveContainer" containerID="5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919" Jan 28 18:38:41 crc kubenswrapper[4749]: E0128 18:38:41.135304 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919\": container with ID starting with 5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919 not found: ID does not exist" containerID="5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.135372 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919"} err="failed to get container status \"5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919\": rpc error: code = NotFound desc = could not find container \"5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919\": container with ID starting with 5f3e9dc9c61c481b508872cf7d5a8c7d184db230c87b529b322b24f3eef1c919 not found: ID does not exist" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.135392 4749 scope.go:117] "RemoveContainer" containerID="86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.148168 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm"] Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.151568 4749 scope.go:117] "RemoveContainer" containerID="86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8" Jan 28 18:38:41 crc kubenswrapper[4749]: E0128 18:38:41.152009 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8\": container with ID starting with 86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8 not found: ID does not exist" containerID="86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 
18:38:41.152062 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8"} err="failed to get container status \"86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8\": rpc error: code = NotFound desc = could not find container \"86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8\": container with ID starting with 86292af00fe9607fa4cd32a1dc09e4f4302d6789c0207cd20cce860b4f46f5e8 not found: ID does not exist" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.152294 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9dcf4c7fd-22dcm"] Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.161484 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc"] Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.167118 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4dcf45dc-w8llc"] Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.187034 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh8gh\" (UniqueName: \"kubernetes.io/projected/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-kube-api-access-rh8gh\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.187067 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.187079 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:41 crc kubenswrapper[4749]: I0128 18:38:41.187093 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.195643 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8"] Jan 28 18:38:42 crc kubenswrapper[4749]: E0128 18:38:42.197835 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a4037e-72f6-49aa-9853-e890e627e4e9" containerName="controller-manager" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.197868 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a4037e-72f6-49aa-9853-e890e627e4e9" containerName="controller-manager" Jan 28 18:38:42 crc kubenswrapper[4749]: E0128 18:38:42.197928 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e45df95-120e-4171-8e11-01988340d6aa" containerName="installer" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.197940 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e45df95-120e-4171-8e11-01988340d6aa" containerName="installer" Jan 28 18:38:42 crc kubenswrapper[4749]: E0128 18:38:42.197998 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" containerName="route-controller-manager" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.198012 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" 
containerName="route-controller-manager" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.198300 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a4037e-72f6-49aa-9853-e890e627e4e9" containerName="controller-manager" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.198379 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e45df95-120e-4171-8e11-01988340d6aa" containerName="installer" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.198394 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" containerName="route-controller-manager" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.199176 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.203025 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.203255 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.203493 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.204196 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.204474 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.204636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.209237 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-ndxjp"] Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.210368 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.212705 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.214246 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.214955 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.215352 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.215881 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.216503 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.222048 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8"] Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.224558 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.226622 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-ndxjp"] Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.299770 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-config\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.299815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-serving-cert\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.299840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-config\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.299866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a750393-b0c7-48d1-99c8-c9c26ec13d02-serving-cert\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc 
kubenswrapper[4749]: I0128 18:38:42.299883 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-proxy-ca-bundles\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.299902 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-client-ca\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.299926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-client-ca\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.299944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvgbr\" (UniqueName: \"kubernetes.io/projected/0a750393-b0c7-48d1-99c8-c9c26ec13d02-kube-api-access-gvgbr\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.299962 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tth6\" (UniqueName: \"kubernetes.io/projected/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-kube-api-access-9tth6\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.400940 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvgbr\" (UniqueName: \"kubernetes.io/projected/0a750393-b0c7-48d1-99c8-c9c26ec13d02-kube-api-access-gvgbr\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.400992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tth6\" (UniqueName: \"kubernetes.io/projected/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-kube-api-access-9tth6\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.401038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-config\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc 
kubenswrapper[4749]: I0128 18:38:42.401067 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-serving-cert\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.401088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-config\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.401112 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a750393-b0c7-48d1-99c8-c9c26ec13d02-serving-cert\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.401140 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-proxy-ca-bundles\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.401163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-client-ca\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.401189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-client-ca\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.402372 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-client-ca\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.402457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-client-ca\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.402508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-proxy-ca-bundles\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.402780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-config\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.404453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-config\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.405824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-serving-cert\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.406218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a750393-b0c7-48d1-99c8-c9c26ec13d02-serving-cert\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.416013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvgbr\" (UniqueName: \"kubernetes.io/projected/0a750393-b0c7-48d1-99c8-c9c26ec13d02-kube-api-access-gvgbr\") pod \"route-controller-manager-6669c9bf4-r8gw8\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.416802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tth6\" (UniqueName: \"kubernetes.io/projected/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-kube-api-access-9tth6\") pod \"controller-manager-79558b8d74-ndxjp\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.543077 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.546588 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.880046 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a4037e-72f6-49aa-9853-e890e627e4e9" path="/var/lib/kubelet/pods/71a4037e-72f6-49aa-9853-e890e627e4e9/volumes" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.881790 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531" path="/var/lib/kubelet/pods/bf7850fe-8b5c-407a-9d2f-f2c7b8f9e531/volumes" Jan 28 18:38:42 crc kubenswrapper[4749]: I0128 18:38:42.985145 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-ndxjp"] Jan 28 18:38:43 crc kubenswrapper[4749]: I0128 18:38:43.031509 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8"] Jan 28 18:38:43 crc kubenswrapper[4749]: W0128 18:38:43.034434 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a750393_b0c7_48d1_99c8_c9c26ec13d02.slice/crio-df6ee18989cd03f270efecf0ff694d1c4319bffe41a3eeaeb7960c65c5738891 WatchSource:0}: Error finding container df6ee18989cd03f270efecf0ff694d1c4319bffe41a3eeaeb7960c65c5738891: Status 404 returned error can't find the container with id df6ee18989cd03f270efecf0ff694d1c4319bffe41a3eeaeb7960c65c5738891 Jan 28 18:38:43 crc kubenswrapper[4749]: I0128 18:38:43.130595 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" event={"ID":"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd","Type":"ContainerStarted","Data":"fb34de60e4d86cec526b0b73a0c582d5c8ca84a8fe3d2c57e706aacf12416592"} Jan 28 18:38:43 crc kubenswrapper[4749]: I0128 18:38:43.131787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" event={"ID":"0a750393-b0c7-48d1-99c8-c9c26ec13d02","Type":"ContainerStarted","Data":"df6ee18989cd03f270efecf0ff694d1c4319bffe41a3eeaeb7960c65c5738891"} Jan 28 18:38:44 crc kubenswrapper[4749]: I0128 18:38:44.140480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" event={"ID":"0a750393-b0c7-48d1-99c8-c9c26ec13d02","Type":"ContainerStarted","Data":"2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40"} Jan 28 18:38:44 crc kubenswrapper[4749]: I0128 18:38:44.141280 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:38:44 crc kubenswrapper[4749]: I0128 18:38:44.144164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" event={"ID":"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd","Type":"ContainerStarted","Data":"96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae"} Jan 28 18:38:44 crc kubenswrapper[4749]: I0128 18:38:44.144359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:44 crc kubenswrapper[4749]: I0128 18:38:44.146873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 
18:38:44 crc kubenswrapper[4749]: I0128 18:38:44.151145 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:38:44 crc kubenswrapper[4749]: I0128 18:38:44.161815 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" podStartSLOduration=4.161795273 podStartE2EDuration="4.161795273s" podCreationTimestamp="2026-01-28 18:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:38:44.160565853 +0000 UTC m=+192.172092638" watchObservedRunningTime="2026-01-28 18:38:44.161795273 +0000 UTC m=+192.173322058" Jan 28 18:38:44 crc kubenswrapper[4749]: I0128 18:38:44.202343 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" podStartSLOduration=4.202303523 podStartE2EDuration="4.202303523s" podCreationTimestamp="2026-01-28 18:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:38:44.200599893 +0000 UTC m=+192.212126678" watchObservedRunningTime="2026-01-28 18:38:44.202303523 +0000 UTC m=+192.213830298" Jan 28 18:38:47 crc kubenswrapper[4749]: I0128 18:38:47.433528 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 28 18:38:47 crc kubenswrapper[4749]: I0128 18:38:47.434244 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68" gracePeriod=5 Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.019172 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.019596 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139483 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139504 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139574 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139594 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139809 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139869 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.139922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.147023 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.193447 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.193827 4749 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68" exitCode=137 Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.193896 4749 scope.go:117] "RemoveContainer" containerID="9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.193954 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.195725 4749 generic.go:334] "Generic (PLEG): container finished" podID="d0092231-157d-498a-a811-c533dccee8ce" containerID="bb5be9913a3a37f304a7e73e831a5e4c4930b47ee3d0f71da8c7e5d162241abe" exitCode=0 Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.195773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" event={"ID":"d0092231-157d-498a-a811-c533dccee8ce","Type":"ContainerDied","Data":"bb5be9913a3a37f304a7e73e831a5e4c4930b47ee3d0f71da8c7e5d162241abe"} Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.196250 4749 scope.go:117] "RemoveContainer" containerID="bb5be9913a3a37f304a7e73e831a5e4c4930b47ee3d0f71da8c7e5d162241abe" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.219603 4749 scope.go:117] "RemoveContainer" containerID="9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68" Jan 28 18:38:53 crc kubenswrapper[4749]: E0128 18:38:53.220128 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68\": container with ID starting with 9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68 not found: ID does not exist" containerID="9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.220161 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68"} err="failed to get container status \"9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68\": rpc error: code = 
NotFound desc = could not find container \"9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68\": container with ID starting with 9756feb2f898d65f45e29296b891d5de0023b1603bb3919cb7c8b476b1c00d68 not found: ID does not exist" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.241375 4749 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.241445 4749 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.241477 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:53 crc kubenswrapper[4749]: I0128 18:38:53.241500 4749 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 28 18:38:54 crc kubenswrapper[4749]: I0128 18:38:54.202623 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" event={"ID":"d0092231-157d-498a-a811-c533dccee8ce","Type":"ContainerStarted","Data":"8c06768f68a39cdc21d01da138ac9a991041ff068b507d6672f5d6e48354a206"} Jan 28 18:38:54 crc kubenswrapper[4749]: I0128 18:38:54.203022 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:38:54 crc kubenswrapper[4749]: I0128 18:38:54.207885 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:38:54 crc kubenswrapper[4749]: I0128 18:38:54.877588 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 28 18:38:57 crc kubenswrapper[4749]: I0128 18:38:57.467387 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:38:57 crc kubenswrapper[4749]: I0128 18:38:57.467715 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:38:57 crc kubenswrapper[4749]: I0128 18:38:57.467757 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:38:57 crc kubenswrapper[4749]: I0128 18:38:57.468281 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4776e4bbf405860b0865ff2250d0eef5141f9c0b4c47049e6cb2d3cde9522949"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 18:38:57 crc kubenswrapper[4749]: I0128 18:38:57.468348 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://4776e4bbf405860b0865ff2250d0eef5141f9c0b4c47049e6cb2d3cde9522949" gracePeriod=600 Jan 28 18:38:58 crc kubenswrapper[4749]: I0128 18:38:58.225034 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="4776e4bbf405860b0865ff2250d0eef5141f9c0b4c47049e6cb2d3cde9522949" exitCode=0 Jan 28 18:38:58 crc kubenswrapper[4749]: I0128 18:38:58.225120 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"4776e4bbf405860b0865ff2250d0eef5141f9c0b4c47049e6cb2d3cde9522949"} Jan 28 18:38:58 crc kubenswrapper[4749]: I0128 18:38:58.225421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"8c194349366700299c12535062fa3cdc45843fcdc3dd4a7242dce6417a2b9ece"} Jan 28 18:39:00 crc kubenswrapper[4749]: I0128 18:39:00.488394 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-ndxjp"] Jan 28 18:39:00 crc kubenswrapper[4749]: I0128 18:39:00.488860 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" podUID="034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" containerName="controller-manager" containerID="cri-o://96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae" gracePeriod=30 Jan 28 18:39:00 crc kubenswrapper[4749]: I0128 18:39:00.507045 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8"] Jan 28 18:39:00 crc kubenswrapper[4749]: I0128 18:39:00.507729 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" podUID="0a750393-b0c7-48d1-99c8-c9c26ec13d02" containerName="route-controller-manager" containerID="cri-o://2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40" gracePeriod=30 Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.022360 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.113685 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.159892 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-client-ca\") pod \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.159978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a750393-b0c7-48d1-99c8-c9c26ec13d02-serving-cert\") pod \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.160012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-config\") pod \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.160058 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvgbr\" (UniqueName: \"kubernetes.io/projected/0a750393-b0c7-48d1-99c8-c9c26ec13d02-kube-api-access-gvgbr\") pod \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\" (UID: \"0a750393-b0c7-48d1-99c8-c9c26ec13d02\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.161084 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-client-ca" (OuterVolumeSpecName: "client-ca") pod "0a750393-b0c7-48d1-99c8-c9c26ec13d02" (UID: "0a750393-b0c7-48d1-99c8-c9c26ec13d02"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.161965 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-config" (OuterVolumeSpecName: "config") pod "0a750393-b0c7-48d1-99c8-c9c26ec13d02" (UID: "0a750393-b0c7-48d1-99c8-c9c26ec13d02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.166207 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a750393-b0c7-48d1-99c8-c9c26ec13d02-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0a750393-b0c7-48d1-99c8-c9c26ec13d02" (UID: "0a750393-b0c7-48d1-99c8-c9c26ec13d02"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.166217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a750393-b0c7-48d1-99c8-c9c26ec13d02-kube-api-access-gvgbr" (OuterVolumeSpecName: "kube-api-access-gvgbr") pod "0a750393-b0c7-48d1-99c8-c9c26ec13d02" (UID: "0a750393-b0c7-48d1-99c8-c9c26ec13d02"). InnerVolumeSpecName "kube-api-access-gvgbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.239953 4749 generic.go:334] "Generic (PLEG): container finished" podID="0a750393-b0c7-48d1-99c8-c9c26ec13d02" containerID="2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40" exitCode=0 Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.239995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" event={"ID":"0a750393-b0c7-48d1-99c8-c9c26ec13d02","Type":"ContainerDied","Data":"2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40"} Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.240528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" event={"ID":"0a750393-b0c7-48d1-99c8-c9c26ec13d02","Type":"ContainerDied","Data":"df6ee18989cd03f270efecf0ff694d1c4319bffe41a3eeaeb7960c65c5738891"} Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.240042 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.240557 4749 scope.go:117] "RemoveContainer" containerID="2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.241776 4749 generic.go:334] "Generic (PLEG): container finished" podID="034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" containerID="96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae" exitCode=0 Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.241815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" event={"ID":"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd","Type":"ContainerDied","Data":"96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae"} Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.241841 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.241860 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79558b8d74-ndxjp" event={"ID":"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd","Type":"ContainerDied","Data":"fb34de60e4d86cec526b0b73a0c582d5c8ca84a8fe3d2c57e706aacf12416592"} Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.255444 4749 scope.go:117] "RemoveContainer" containerID="2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40" Jan 28 18:39:01 crc kubenswrapper[4749]: E0128 18:39:01.255819 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40\": container with ID starting with 2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40 not found: ID does not exist" containerID="2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.255863 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40"} err="failed to get container status \"2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40\": rpc error: code = NotFound desc = could not find container \"2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40\": container with ID starting with 2e2e8ede3c94e18f27c4c8e5b43d37daaed1439b44b1527ac48b4b70ab8d7e40 not found: ID does not exist" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.255886 4749 scope.go:117] "RemoveContainer" containerID="96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.260836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-config\") pod \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.260911 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-proxy-ca-bundles\") pod \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.260954 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tth6\" (UniqueName: \"kubernetes.io/projected/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-kube-api-access-9tth6\") pod \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.260987 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-serving-cert\") pod \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\" (UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.261066 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-client-ca\") pod \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\" 
(UID: \"034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd\") " Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.261362 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.261378 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a750393-b0c7-48d1-99c8-c9c26ec13d02-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.261389 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a750393-b0c7-48d1-99c8-c9c26ec13d02-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.261400 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvgbr\" (UniqueName: \"kubernetes.io/projected/0a750393-b0c7-48d1-99c8-c9c26ec13d02-kube-api-access-gvgbr\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.261745 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-client-ca" (OuterVolumeSpecName: "client-ca") pod "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" (UID: "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.262021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-config" (OuterVolumeSpecName: "config") pod "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" (UID: "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.262091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" (UID: "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.264588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" (UID: "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.264857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-kube-api-access-9tth6" (OuterVolumeSpecName: "kube-api-access-9tth6") pod "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" (UID: "034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd"). InnerVolumeSpecName "kube-api-access-9tth6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.272452 4749 scope.go:117] "RemoveContainer" containerID="96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae" Jan 28 18:39:01 crc kubenswrapper[4749]: E0128 18:39:01.272977 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae\": container with ID starting with 96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae not found: ID does not exist" containerID="96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.273148 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae"} err="failed to get container status \"96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae\": rpc error: code = NotFound desc = could not find container \"96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae\": container with ID starting with 96f9f9178e7c67dc5af61d51db7023a6ca67cb22ad4916466a0c8f13a7aef3ae not found: ID does not exist" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.277038 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8"] Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.280005 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-r8gw8"] Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.363034 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.363076 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tth6\" (UniqueName: \"kubernetes.io/projected/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-kube-api-access-9tth6\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.363118 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.363131 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.363144 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.568584 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-ndxjp"] Jan 28 18:39:01 crc kubenswrapper[4749]: I0128 18:39:01.573915 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-ndxjp"] Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.197187 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp"] Jan 28 18:39:02 crc kubenswrapper[4749]: E0128 18:39:02.197480 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" containerName="controller-manager" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.197498 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" containerName="controller-manager" Jan 28 18:39:02 crc kubenswrapper[4749]: E0128 18:39:02.197516 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.197523 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 18:39:02 crc kubenswrapper[4749]: E0128 18:39:02.197542 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a750393-b0c7-48d1-99c8-c9c26ec13d02" containerName="route-controller-manager" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.197549 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a750393-b0c7-48d1-99c8-c9c26ec13d02" containerName="route-controller-manager" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.197663 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.197678 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" containerName="controller-manager" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.197691 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a750393-b0c7-48d1-99c8-c9c26ec13d02" containerName="route-controller-manager" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.198133 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.200298 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.200509 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.200583 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.200928 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7566f66c88-9jjkw"] Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.201086 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.201655 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.208853 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp"] Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.209312 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.209541 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.209808 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.209959 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.210099 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.210225 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.215543 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.215686 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.215795 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.217455 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7566f66c88-9jjkw"] Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-config\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjt4\" (UniqueName: \"kubernetes.io/projected/2e97ae33-5cee-47b5-98dd-7d3979437654-kube-api-access-sqjt4\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8de8560-b540-4a79-a828-77ab59f06f95-serving-cert\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 
18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373642 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dhcn\" (UniqueName: \"kubernetes.io/projected/a8de8560-b540-4a79-a828-77ab59f06f95-kube-api-access-6dhcn\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e97ae33-5cee-47b5-98dd-7d3979437654-serving-cert\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-client-ca\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-client-ca\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-proxy-ca-bundles\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.373789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-config\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.475628 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e97ae33-5cee-47b5-98dd-7d3979437654-serving-cert\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.475974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-client-ca\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 
18:39:02.476031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-client-ca\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.476053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-proxy-ca-bundles\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.476094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-config\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.476111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-config\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.476133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjt4\" (UniqueName: \"kubernetes.io/projected/2e97ae33-5cee-47b5-98dd-7d3979437654-kube-api-access-sqjt4\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.476263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8de8560-b540-4a79-a828-77ab59f06f95-serving-cert\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.476294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dhcn\" (UniqueName: \"kubernetes.io/projected/a8de8560-b540-4a79-a828-77ab59f06f95-kube-api-access-6dhcn\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.477260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-client-ca\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.477581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-config\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.478969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-proxy-ca-bundles\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.479395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-config\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.480959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-client-ca\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.482442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8de8560-b540-4a79-a828-77ab59f06f95-serving-cert\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.483169 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e97ae33-5cee-47b5-98dd-7d3979437654-serving-cert\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.491450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dhcn\" (UniqueName: \"kubernetes.io/projected/a8de8560-b540-4a79-a828-77ab59f06f95-kube-api-access-6dhcn\") pod \"route-controller-manager-665b455574-zsqdp\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.492929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjt4\" (UniqueName: \"kubernetes.io/projected/2e97ae33-5cee-47b5-98dd-7d3979437654-kube-api-access-sqjt4\") pod \"controller-manager-7566f66c88-9jjkw\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.525592 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.536041 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.880996 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd" path="/var/lib/kubelet/pods/034c61a1-0f6e-4f4e-ac4b-21a6a32f8bdd/volumes" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.881954 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a750393-b0c7-48d1-99c8-c9c26ec13d02" path="/var/lib/kubelet/pods/0a750393-b0c7-48d1-99c8-c9c26ec13d02/volumes" Jan 28 18:39:02 crc kubenswrapper[4749]: I0128 18:39:02.981969 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp"] Jan 28 18:39:03 crc kubenswrapper[4749]: W0128 18:39:02.991184 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8de8560_b540_4a79_a828_77ab59f06f95.slice/crio-6de7dce80a8cbc4407d931f855a1aa662ea228a21d5fa97eb6b2a33bafc8d3a6 WatchSource:0}: Error finding container 6de7dce80a8cbc4407d931f855a1aa662ea228a21d5fa97eb6b2a33bafc8d3a6: Status 404 returned error can't find the container with id 6de7dce80a8cbc4407d931f855a1aa662ea228a21d5fa97eb6b2a33bafc8d3a6 Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.069677 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7566f66c88-9jjkw"] Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.255988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" event={"ID":"a8de8560-b540-4a79-a828-77ab59f06f95","Type":"ContainerStarted","Data":"f26e037b1cd83145b16ae917a5fd487d514785cae1f3a13298e5f08e4d80faa0"} Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.256348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" event={"ID":"a8de8560-b540-4a79-a828-77ab59f06f95","Type":"ContainerStarted","Data":"6de7dce80a8cbc4407d931f855a1aa662ea228a21d5fa97eb6b2a33bafc8d3a6"} Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.257452 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.259067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" event={"ID":"2e97ae33-5cee-47b5-98dd-7d3979437654","Type":"ContainerStarted","Data":"48b754b29c3bc7275137b4cecd1546ee584a0ac513db8697081789d1e360dfe1"} Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.259097 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" event={"ID":"2e97ae33-5cee-47b5-98dd-7d3979437654","Type":"ContainerStarted","Data":"4aa88bd3069fd2c1b07692d1684490888548749129f7eda1847a0da206e9d0c9"} Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.259544 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.261205 4749 patch_prober.go:28] interesting pod/controller-manager-7566f66c88-9jjkw container/controller-manager namespace/openshift-controller-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.261245 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" podUID="2e97ae33-5cee-47b5-98dd-7d3979437654" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.275378 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" podStartSLOduration=3.275362447 podStartE2EDuration="3.275362447s" podCreationTimestamp="2026-01-28 18:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:39:03.273878961 +0000 UTC m=+211.285405756" watchObservedRunningTime="2026-01-28 18:39:03.275362447 +0000 UTC m=+211.286889212" Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.738600 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:03 crc kubenswrapper[4749]: I0128 18:39:03.753317 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" podStartSLOduration=3.753296164 podStartE2EDuration="3.753296164s" podCreationTimestamp="2026-01-28 18:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:39:03.291502873 +0000 UTC m=+211.303029668" watchObservedRunningTime="2026-01-28 18:39:03.753296164 +0000 UTC m=+211.764822939" Jan 28 18:39:04 crc kubenswrapper[4749]: I0128 18:39:04.269660 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:06 crc kubenswrapper[4749]: I0128 18:39:06.585118 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 28 18:39:08 crc kubenswrapper[4749]: I0128 18:39:08.484141 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4ngh4"] Jan 28 18:39:08 crc kubenswrapper[4749]: I0128 18:39:08.484416 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4ngh4" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerName="registry-server" containerID="cri-o://63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad" gracePeriod=2 Jan 28 18:39:08 crc kubenswrapper[4749]: I0128 18:39:08.685359 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbrpd"] Jan 28 18:39:08 crc kubenswrapper[4749]: I0128 18:39:08.685933 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jbrpd" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerName="registry-server" containerID="cri-o://50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af" gracePeriod=2 Jan 28 18:39:08 crc kubenswrapper[4749]: I0128 18:39:08.956784 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.055757 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-utilities\") pod \"b76a4267-3557-4246-b3dc-84a610d9fbd4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.055807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-catalog-content\") pod \"b76a4267-3557-4246-b3dc-84a610d9fbd4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.055857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqxqx\" (UniqueName: \"kubernetes.io/projected/b76a4267-3557-4246-b3dc-84a610d9fbd4-kube-api-access-jqxqx\") pod \"b76a4267-3557-4246-b3dc-84a610d9fbd4\" (UID: \"b76a4267-3557-4246-b3dc-84a610d9fbd4\") " Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.057646 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-utilities" (OuterVolumeSpecName: "utilities") pod "b76a4267-3557-4246-b3dc-84a610d9fbd4" (UID: "b76a4267-3557-4246-b3dc-84a610d9fbd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.061527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76a4267-3557-4246-b3dc-84a610d9fbd4-kube-api-access-jqxqx" (OuterVolumeSpecName: "kube-api-access-jqxqx") pod "b76a4267-3557-4246-b3dc-84a610d9fbd4" (UID: "b76a4267-3557-4246-b3dc-84a610d9fbd4"). InnerVolumeSpecName "kube-api-access-jqxqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.106640 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.107602 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b76a4267-3557-4246-b3dc-84a610d9fbd4" (UID: "b76a4267-3557-4246-b3dc-84a610d9fbd4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.157291 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqxqx\" (UniqueName: \"kubernetes.io/projected/b76a4267-3557-4246-b3dc-84a610d9fbd4-kube-api-access-jqxqx\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.157350 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.157360 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76a4267-3557-4246-b3dc-84a610d9fbd4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.259446 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-catalog-content\") pod \"c03b812d-6833-4c65-887b-0fa0a6c1227a\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.259566 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-utilities\") pod \"c03b812d-6833-4c65-887b-0fa0a6c1227a\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.259654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9fdk\" (UniqueName: \"kubernetes.io/projected/c03b812d-6833-4c65-887b-0fa0a6c1227a-kube-api-access-n9fdk\") pod \"c03b812d-6833-4c65-887b-0fa0a6c1227a\" (UID: \"c03b812d-6833-4c65-887b-0fa0a6c1227a\") " Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.260268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-utilities" (OuterVolumeSpecName: "utilities") pod "c03b812d-6833-4c65-887b-0fa0a6c1227a" (UID: "c03b812d-6833-4c65-887b-0fa0a6c1227a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.262825 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03b812d-6833-4c65-887b-0fa0a6c1227a-kube-api-access-n9fdk" (OuterVolumeSpecName: "kube-api-access-n9fdk") pod "c03b812d-6833-4c65-887b-0fa0a6c1227a" (UID: "c03b812d-6833-4c65-887b-0fa0a6c1227a"). InnerVolumeSpecName "kube-api-access-n9fdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.288550 4749 generic.go:334] "Generic (PLEG): container finished" podID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerID="63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad" exitCode=0 Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.288638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ngh4" event={"ID":"b76a4267-3557-4246-b3dc-84a610d9fbd4","Type":"ContainerDied","Data":"63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad"} Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.288669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ngh4" event={"ID":"b76a4267-3557-4246-b3dc-84a610d9fbd4","Type":"ContainerDied","Data":"c400e76b96fe5d4ae74e4b919c2f2055142c6506f9ffb1475edb2af085e0b5b8"} Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.288688 4749 scope.go:117] "RemoveContainer" containerID="63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.288755 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4ngh4" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.291255 4749 generic.go:334] "Generic (PLEG): container finished" podID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerID="50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af" exitCode=0 Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.291304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbrpd" event={"ID":"c03b812d-6833-4c65-887b-0fa0a6c1227a","Type":"ContainerDied","Data":"50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af"} Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.291361 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbrpd" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.291324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbrpd" event={"ID":"c03b812d-6833-4c65-887b-0fa0a6c1227a","Type":"ContainerDied","Data":"c2676b1bdc8e15da808e143afad06ea893a551e5b5eb8ba2bc4accbd785238d0"} Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.302487 4749 scope.go:117] "RemoveContainer" containerID="5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.308405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c03b812d-6833-4c65-887b-0fa0a6c1227a" (UID: "c03b812d-6833-4c65-887b-0fa0a6c1227a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.324046 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4ngh4"] Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.325997 4749 scope.go:117] "RemoveContainer" containerID="975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.327719 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4ngh4"] Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.337636 4749 scope.go:117] "RemoveContainer" containerID="63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad" Jan 28 18:39:09 crc kubenswrapper[4749]: E0128 18:39:09.338251 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad\": container with ID starting with 63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad not found: ID does not exist" containerID="63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.338312 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad"} err="failed to get container status \"63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad\": rpc error: code = NotFound desc = could not find container \"63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad\": container with ID starting with 63060013b2589f38c3b8df8e651a7e61bde8a2a76204cdcdc5e0b4e111ceb6ad not found: ID does not exist" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.338368 4749 scope.go:117] "RemoveContainer" containerID="5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f" Jan 28 18:39:09 crc kubenswrapper[4749]: E0128 18:39:09.338797 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f\": container with ID starting with 5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f not found: ID does not exist" containerID="5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.338843 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f"} err="failed to get container status \"5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f\": rpc error: code = NotFound desc = could not find container \"5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f\": container with ID starting with 5ce276f0963799d2043dff468b79062dcaa8bd9534a36e0b553c640a0344049f not found: ID does not exist" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.338870 4749 scope.go:117] "RemoveContainer" containerID="975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4" Jan 28 18:39:09 crc kubenswrapper[4749]: E0128 18:39:09.339162 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4\": container with ID starting with 
975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4 not found: ID does not exist" containerID="975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.339204 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4"} err="failed to get container status \"975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4\": rpc error: code = NotFound desc = could not find container \"975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4\": container with ID starting with 975220a291eae24b61949cb28d7d1eedc9b54b92b846e559bcd974524729d4f4 not found: ID does not exist" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.339228 4749 scope.go:117] "RemoveContainer" containerID="50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.356990 4749 scope.go:117] "RemoveContainer" containerID="b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.361497 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9fdk\" (UniqueName: \"kubernetes.io/projected/c03b812d-6833-4c65-887b-0fa0a6c1227a-kube-api-access-n9fdk\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.361526 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.361536 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03b812d-6833-4c65-887b-0fa0a6c1227a-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.372232 4749 scope.go:117] "RemoveContainer" containerID="d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.386512 4749 scope.go:117] "RemoveContainer" containerID="50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af" Jan 28 18:39:09 crc kubenswrapper[4749]: E0128 18:39:09.386929 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af\": container with ID starting with 50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af not found: ID does not exist" containerID="50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.386964 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af"} err="failed to get container status \"50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af\": rpc error: code = NotFound desc = could not find container \"50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af\": container with ID starting with 50f592ea2d3651efa0aa3e30e2f1dcd0f74dae21e933a2f4d65d7c4b528f04af not found: ID does not exist" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.387019 4749 scope.go:117] "RemoveContainer" containerID="b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593" Jan 28 18:39:09 crc 
kubenswrapper[4749]: E0128 18:39:09.389736 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593\": container with ID starting with b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593 not found: ID does not exist" containerID="b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.389769 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593"} err="failed to get container status \"b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593\": rpc error: code = NotFound desc = could not find container \"b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593\": container with ID starting with b26fb3aaf022b1d30e4e924b19cd9432e45e7498ac822620b9ad809e02374593 not found: ID does not exist" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.389790 4749 scope.go:117] "RemoveContainer" containerID="d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7" Jan 28 18:39:09 crc kubenswrapper[4749]: E0128 18:39:09.390836 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7\": container with ID starting with d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7 not found: ID does not exist" containerID="d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.390865 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7"} err="failed to get container status \"d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7\": rpc error: code = NotFound desc = could not find container \"d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7\": container with ID starting with d8d1fb074dde115781de2f3370b34efde54c40e521b7d4c2c03acf6b1d7ceca7 not found: ID does not exist" Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.638955 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbrpd"] Jan 28 18:39:09 crc kubenswrapper[4749]: I0128 18:39:09.644533 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jbrpd"] Jan 28 18:39:10 crc kubenswrapper[4749]: I0128 18:39:10.877294 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" path="/var/lib/kubelet/pods/b76a4267-3557-4246-b3dc-84a610d9fbd4/volumes" Jan 28 18:39:10 crc kubenswrapper[4749]: I0128 18:39:10.877934 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" path="/var/lib/kubelet/pods/c03b812d-6833-4c65-887b-0fa0a6c1227a/volumes" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.083919 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tndcr"] Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.084198 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tndcr" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" 
containerName="registry-server" containerID="cri-o://9d37f94c2dc698e57f6a3495021472c11f87b6465c6bb6af16208d4abc76c63a" gracePeriod=2 Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.284945 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25wll"] Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.285519 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-25wll" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="registry-server" containerID="cri-o://82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42" gracePeriod=2 Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.312970 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerID="9d37f94c2dc698e57f6a3495021472c11f87b6465c6bb6af16208d4abc76c63a" exitCode=0 Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.313008 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tndcr" event={"ID":"9ec3b90d-1483-485e-ab0a-52af455cc9ea","Type":"ContainerDied","Data":"9d37f94c2dc698e57f6a3495021472c11f87b6465c6bb6af16208d4abc76c63a"} Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.549533 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.689942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-catalog-content\") pod \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.690033 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzcsm\" (UniqueName: \"kubernetes.io/projected/9ec3b90d-1483-485e-ab0a-52af455cc9ea-kube-api-access-zzcsm\") pod \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.690175 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-utilities\") pod \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\" (UID: \"9ec3b90d-1483-485e-ab0a-52af455cc9ea\") " Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.691358 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-utilities" (OuterVolumeSpecName: "utilities") pod "9ec3b90d-1483-485e-ab0a-52af455cc9ea" (UID: "9ec3b90d-1483-485e-ab0a-52af455cc9ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.695837 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec3b90d-1483-485e-ab0a-52af455cc9ea-kube-api-access-zzcsm" (OuterVolumeSpecName: "kube-api-access-zzcsm") pod "9ec3b90d-1483-485e-ab0a-52af455cc9ea" (UID: "9ec3b90d-1483-485e-ab0a-52af455cc9ea"). InnerVolumeSpecName "kube-api-access-zzcsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.713890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ec3b90d-1483-485e-ab0a-52af455cc9ea" (UID: "9ec3b90d-1483-485e-ab0a-52af455cc9ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.774056 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.792019 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.792115 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzcsm\" (UniqueName: \"kubernetes.io/projected/9ec3b90d-1483-485e-ab0a-52af455cc9ea-kube-api-access-zzcsm\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.792133 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ec3b90d-1483-485e-ab0a-52af455cc9ea-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.892899 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-catalog-content\") pod \"3dcb2d5f-2613-4277-bba9-89e404c7832a\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.893401 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-utilities\") pod \"3dcb2d5f-2613-4277-bba9-89e404c7832a\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.893501 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl9kb\" (UniqueName: \"kubernetes.io/projected/3dcb2d5f-2613-4277-bba9-89e404c7832a-kube-api-access-zl9kb\") pod \"3dcb2d5f-2613-4277-bba9-89e404c7832a\" (UID: \"3dcb2d5f-2613-4277-bba9-89e404c7832a\") " Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.894277 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-utilities" (OuterVolumeSpecName: "utilities") pod "3dcb2d5f-2613-4277-bba9-89e404c7832a" (UID: "3dcb2d5f-2613-4277-bba9-89e404c7832a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.897490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dcb2d5f-2613-4277-bba9-89e404c7832a-kube-api-access-zl9kb" (OuterVolumeSpecName: "kube-api-access-zl9kb") pod "3dcb2d5f-2613-4277-bba9-89e404c7832a" (UID: "3dcb2d5f-2613-4277-bba9-89e404c7832a"). InnerVolumeSpecName "kube-api-access-zl9kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.994615 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:11 crc kubenswrapper[4749]: I0128 18:39:11.994647 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl9kb\" (UniqueName: \"kubernetes.io/projected/3dcb2d5f-2613-4277-bba9-89e404c7832a-kube-api-access-zl9kb\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.010416 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3dcb2d5f-2613-4277-bba9-89e404c7832a" (UID: "3dcb2d5f-2613-4277-bba9-89e404c7832a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.095629 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcb2d5f-2613-4277-bba9-89e404c7832a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.321039 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tndcr" event={"ID":"9ec3b90d-1483-485e-ab0a-52af455cc9ea","Type":"ContainerDied","Data":"85e49479d2193e3ee625d0350eebd144f266d1c27341a4c40cfe299e93828aee"} Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.321054 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tndcr" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.321097 4749 scope.go:117] "RemoveContainer" containerID="9d37f94c2dc698e57f6a3495021472c11f87b6465c6bb6af16208d4abc76c63a" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.323885 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerID="82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42" exitCode=0 Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.323928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25wll" event={"ID":"3dcb2d5f-2613-4277-bba9-89e404c7832a","Type":"ContainerDied","Data":"82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42"} Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.323958 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25wll" event={"ID":"3dcb2d5f-2613-4277-bba9-89e404c7832a","Type":"ContainerDied","Data":"1abd0ad8e9f50c9e1339a0c45af704bfdcc1c9cf70b8e7707af08322876c469e"} Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.324020 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25wll" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.336600 4749 scope.go:117] "RemoveContainer" containerID="5d6a6f9b707d6baef2f8a4063cc8975e39135845eb3d6c7b5863d2297d751b2b" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.355212 4749 scope.go:117] "RemoveContainer" containerID="e1c4945f86f71a02ad519381dadfaecc9c829d3847a629eeca06e89d2ae803de" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.371262 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25wll"] Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.381562 4749 scope.go:117] "RemoveContainer" containerID="82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.384969 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-25wll"] Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.399082 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tndcr"] Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.401436 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tndcr"] Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.402465 4749 scope.go:117] "RemoveContainer" containerID="a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.420996 4749 scope.go:117] "RemoveContainer" containerID="7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.456923 4749 scope.go:117] "RemoveContainer" containerID="82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42" Jan 28 18:39:12 crc kubenswrapper[4749]: E0128 18:39:12.460286 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42\": container with ID starting with 82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42 not found: ID does not exist" containerID="82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.460360 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42"} err="failed to get container status \"82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42\": rpc error: code = NotFound desc = could not find container \"82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42\": container with ID starting with 82d6b5653dc000a530e182a27383374c1247e15e88fbd74a1b98f3e894d06b42 not found: ID does not exist" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.460389 4749 scope.go:117] "RemoveContainer" containerID="a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c" Jan 28 18:39:12 crc kubenswrapper[4749]: E0128 18:39:12.463520 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c\": container with ID starting with a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c not found: ID does not exist" containerID="a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c" Jan 28 18:39:12 
crc kubenswrapper[4749]: I0128 18:39:12.463576 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c"} err="failed to get container status \"a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c\": rpc error: code = NotFound desc = could not find container \"a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c\": container with ID starting with a354de0a614217ded0e8dcfeead4e61805f902be34a7fab437ad11008ae9b67c not found: ID does not exist" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.463610 4749 scope.go:117] "RemoveContainer" containerID="7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a" Jan 28 18:39:12 crc kubenswrapper[4749]: E0128 18:39:12.466501 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a\": container with ID starting with 7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a not found: ID does not exist" containerID="7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.466529 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a"} err="failed to get container status \"7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a\": rpc error: code = NotFound desc = could not find container \"7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a\": container with ID starting with 7a04e8595b923d77471d4d0eeab460919da339b34a2788c46a7dbfe12045477a not found: ID does not exist" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.877158 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" path="/var/lib/kubelet/pods/3dcb2d5f-2613-4277-bba9-89e404c7832a/volumes" Jan 28 18:39:12 crc kubenswrapper[4749]: I0128 18:39:12.878112 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" path="/var/lib/kubelet/pods/9ec3b90d-1483-485e-ab0a-52af455cc9ea/volumes" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.188502 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ldb7k"] Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189373 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerName="extract-content" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189390 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerName="extract-content" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189406 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189413 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189421 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerName="extract-utilities" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 
18:39:18.189429 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerName="extract-utilities" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189442 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerName="extract-utilities" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189450 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerName="extract-utilities" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189461 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="extract-utilities" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189468 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="extract-utilities" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189480 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerName="extract-utilities" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189487 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerName="extract-utilities" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189495 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189585 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189598 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="extract-content" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189606 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="extract-content" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189621 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerName="extract-content" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189629 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerName="extract-content" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189638 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189645 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189657 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerName="extract-content" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189665 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerName="extract-content" Jan 28 18:39:18 crc kubenswrapper[4749]: E0128 18:39:18.189676 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="registry-server" Jan 28 
18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189683 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189790 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03b812d-6833-4c65-887b-0fa0a6c1227a" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189805 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcb2d5f-2613-4277-bba9-89e404c7832a" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189815 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76a4267-3557-4246-b3dc-84a610d9fbd4" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.189828 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec3b90d-1483-485e-ab0a-52af455cc9ea" containerName="registry-server" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.190300 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.213619 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ldb7k"] Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.373891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-registry-tls\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.373973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-registry-certificates\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.373994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvjm5\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-kube-api-access-bvjm5\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.374029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.374157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-trusted-ca\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.374225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.374396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-bound-sa-token\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.374429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.398933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.475653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.475726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-bound-sa-token\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.475745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.475766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-registry-tls\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.475811 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-registry-certificates\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.475834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvjm5\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-kube-api-access-bvjm5\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.475871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-trusted-ca\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.476225 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.482679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-trusted-ca\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.485835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.487622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-registry-tls\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.494956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-registry-certificates\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.510415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-bound-sa-token\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: 
\"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.512206 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvjm5\" (UniqueName: \"kubernetes.io/projected/6ce66796-91d0-48b0-9c8d-f479ee3ff0fa-kube-api-access-bvjm5\") pod \"image-registry-66df7c8f76-ldb7k\" (UID: \"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:18 crc kubenswrapper[4749]: I0128 18:39:18.807476 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:19 crc kubenswrapper[4749]: I0128 18:39:19.276073 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ldb7k"] Jan 28 18:39:19 crc kubenswrapper[4749]: W0128 18:39:19.283890 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce66796_91d0_48b0_9c8d_f479ee3ff0fa.slice/crio-6f0fcb0b485765f6f9a368047aaa717c2a6722640b1cfbe443defb3950acaf18 WatchSource:0}: Error finding container 6f0fcb0b485765f6f9a368047aaa717c2a6722640b1cfbe443defb3950acaf18: Status 404 returned error can't find the container with id 6f0fcb0b485765f6f9a368047aaa717c2a6722640b1cfbe443defb3950acaf18 Jan 28 18:39:19 crc kubenswrapper[4749]: I0128 18:39:19.362689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" event={"ID":"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa","Type":"ContainerStarted","Data":"6f0fcb0b485765f6f9a368047aaa717c2a6722640b1cfbe443defb3950acaf18"} Jan 28 18:39:20 crc kubenswrapper[4749]: I0128 18:39:20.369193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" event={"ID":"6ce66796-91d0-48b0-9c8d-f479ee3ff0fa","Type":"ContainerStarted","Data":"b7f3ba43662c793974b2577fcdc6a0b38eacb96190261e1baca6b9df83a0e7ed"} Jan 28 18:39:20 crc kubenswrapper[4749]: I0128 18:39:20.370487 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:20 crc kubenswrapper[4749]: I0128 18:39:20.388376 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" podStartSLOduration=2.388360549 podStartE2EDuration="2.388360549s" podCreationTimestamp="2026-01-28 18:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:39:20.386241638 +0000 UTC m=+228.397768423" watchObservedRunningTime="2026-01-28 18:39:20.388360549 +0000 UTC m=+228.399887324" Jan 28 18:39:20 crc kubenswrapper[4749]: I0128 18:39:20.532060 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp"] Jan 28 18:39:20 crc kubenswrapper[4749]: I0128 18:39:20.532302 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" podUID="a8de8560-b540-4a79-a828-77ab59f06f95" containerName="route-controller-manager" containerID="cri-o://f26e037b1cd83145b16ae917a5fd487d514785cae1f3a13298e5f08e4d80faa0" gracePeriod=30 Jan 28 18:39:20 crc kubenswrapper[4749]: I0128 
18:39:20.540057 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7566f66c88-9jjkw"] Jan 28 18:39:20 crc kubenswrapper[4749]: I0128 18:39:20.540303 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" podUID="2e97ae33-5cee-47b5-98dd-7d3979437654" containerName="controller-manager" containerID="cri-o://48b754b29c3bc7275137b4cecd1546ee584a0ac513db8697081789d1e360dfe1" gracePeriod=30 Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.375710 4749 generic.go:334] "Generic (PLEG): container finished" podID="a8de8560-b540-4a79-a828-77ab59f06f95" containerID="f26e037b1cd83145b16ae917a5fd487d514785cae1f3a13298e5f08e4d80faa0" exitCode=0 Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.375835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" event={"ID":"a8de8560-b540-4a79-a828-77ab59f06f95","Type":"ContainerDied","Data":"f26e037b1cd83145b16ae917a5fd487d514785cae1f3a13298e5f08e4d80faa0"} Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.378839 4749 generic.go:334] "Generic (PLEG): container finished" podID="2e97ae33-5cee-47b5-98dd-7d3979437654" containerID="48b754b29c3bc7275137b4cecd1546ee584a0ac513db8697081789d1e360dfe1" exitCode=0 Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.378912 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" event={"ID":"2e97ae33-5cee-47b5-98dd-7d3979437654","Type":"ContainerDied","Data":"48b754b29c3bc7275137b4cecd1546ee584a0ac513db8697081789d1e360dfe1"} Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.554453 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.722014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8de8560-b540-4a79-a828-77ab59f06f95-serving-cert\") pod \"a8de8560-b540-4a79-a828-77ab59f06f95\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.722068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-config\") pod \"a8de8560-b540-4a79-a828-77ab59f06f95\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.722117 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dhcn\" (UniqueName: \"kubernetes.io/projected/a8de8560-b540-4a79-a828-77ab59f06f95-kube-api-access-6dhcn\") pod \"a8de8560-b540-4a79-a828-77ab59f06f95\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.722968 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8de8560-b540-4a79-a828-77ab59f06f95" (UID: "a8de8560-b540-4a79-a828-77ab59f06f95"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.723138 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-config" (OuterVolumeSpecName: "config") pod "a8de8560-b540-4a79-a828-77ab59f06f95" (UID: "a8de8560-b540-4a79-a828-77ab59f06f95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.723170 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-client-ca\") pod \"a8de8560-b540-4a79-a828-77ab59f06f95\" (UID: \"a8de8560-b540-4a79-a828-77ab59f06f95\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.723448 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.723458 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8de8560-b540-4a79-a828-77ab59f06f95-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.727873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8de8560-b540-4a79-a828-77ab59f06f95-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8de8560-b540-4a79-a828-77ab59f06f95" (UID: "a8de8560-b540-4a79-a828-77ab59f06f95"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.728124 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8de8560-b540-4a79-a828-77ab59f06f95-kube-api-access-6dhcn" (OuterVolumeSpecName: "kube-api-access-6dhcn") pod "a8de8560-b540-4a79-a828-77ab59f06f95" (UID: "a8de8560-b540-4a79-a828-77ab59f06f95"). InnerVolumeSpecName "kube-api-access-6dhcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.804150 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.824891 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8de8560-b540-4a79-a828-77ab59f06f95-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.824931 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dhcn\" (UniqueName: \"kubernetes.io/projected/a8de8560-b540-4a79-a828-77ab59f06f95-kube-api-access-6dhcn\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.925841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-proxy-ca-bundles\") pod \"2e97ae33-5cee-47b5-98dd-7d3979437654\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.926548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e97ae33-5cee-47b5-98dd-7d3979437654-serving-cert\") pod \"2e97ae33-5cee-47b5-98dd-7d3979437654\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.926593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-config\") pod \"2e97ae33-5cee-47b5-98dd-7d3979437654\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.926616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2e97ae33-5cee-47b5-98dd-7d3979437654" (UID: "2e97ae33-5cee-47b5-98dd-7d3979437654"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.926642 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjt4\" (UniqueName: \"kubernetes.io/projected/2e97ae33-5cee-47b5-98dd-7d3979437654-kube-api-access-sqjt4\") pod \"2e97ae33-5cee-47b5-98dd-7d3979437654\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.926657 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-client-ca\") pod \"2e97ae33-5cee-47b5-98dd-7d3979437654\" (UID: \"2e97ae33-5cee-47b5-98dd-7d3979437654\") " Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.927069 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.927546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e97ae33-5cee-47b5-98dd-7d3979437654" (UID: "2e97ae33-5cee-47b5-98dd-7d3979437654"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.927594 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-config" (OuterVolumeSpecName: "config") pod "2e97ae33-5cee-47b5-98dd-7d3979437654" (UID: "2e97ae33-5cee-47b5-98dd-7d3979437654"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.929223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e97ae33-5cee-47b5-98dd-7d3979437654-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e97ae33-5cee-47b5-98dd-7d3979437654" (UID: "2e97ae33-5cee-47b5-98dd-7d3979437654"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:39:21 crc kubenswrapper[4749]: I0128 18:39:21.930469 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e97ae33-5cee-47b5-98dd-7d3979437654-kube-api-access-sqjt4" (OuterVolumeSpecName: "kube-api-access-sqjt4") pod "2e97ae33-5cee-47b5-98dd-7d3979437654" (UID: "2e97ae33-5cee-47b5-98dd-7d3979437654"). InnerVolumeSpecName "kube-api-access-sqjt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.028329 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e97ae33-5cee-47b5-98dd-7d3979437654-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.028389 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.028398 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e97ae33-5cee-47b5-98dd-7d3979437654-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.028408 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjt4\" (UniqueName: \"kubernetes.io/projected/2e97ae33-5cee-47b5-98dd-7d3979437654-kube-api-access-sqjt4\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.213528 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx"] Jan 28 18:39:22 crc kubenswrapper[4749]: E0128 18:39:22.214268 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e97ae33-5cee-47b5-98dd-7d3979437654" containerName="controller-manager" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.214424 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e97ae33-5cee-47b5-98dd-7d3979437654" containerName="controller-manager" Jan 28 18:39:22 crc kubenswrapper[4749]: E0128 18:39:22.214529 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8de8560-b540-4a79-a828-77ab59f06f95" containerName="route-controller-manager" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.214604 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8de8560-b540-4a79-a828-77ab59f06f95" containerName="route-controller-manager" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.214892 4749 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="a8de8560-b540-4a79-a828-77ab59f06f95" containerName="route-controller-manager" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.214981 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e97ae33-5cee-47b5-98dd-7d3979437654" containerName="controller-manager" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.215560 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.224694 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx"] Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.332295 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59fhg\" (UniqueName: \"kubernetes.io/projected/ed74dcac-b479-4a6c-9304-05eac99fe38b-kube-api-access-59fhg\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.332495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed74dcac-b479-4a6c-9304-05eac99fe38b-serving-cert\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.332724 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed74dcac-b479-4a6c-9304-05eac99fe38b-config\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.332802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed74dcac-b479-4a6c-9304-05eac99fe38b-client-ca\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.385507 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" event={"ID":"a8de8560-b540-4a79-a828-77ab59f06f95","Type":"ContainerDied","Data":"6de7dce80a8cbc4407d931f855a1aa662ea228a21d5fa97eb6b2a33bafc8d3a6"} Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.385586 4749 scope.go:117] "RemoveContainer" containerID="f26e037b1cd83145b16ae917a5fd487d514785cae1f3a13298e5f08e4d80faa0" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.385515 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.387178 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.387173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7566f66c88-9jjkw" event={"ID":"2e97ae33-5cee-47b5-98dd-7d3979437654","Type":"ContainerDied","Data":"4aa88bd3069fd2c1b07692d1684490888548749129f7eda1847a0da206e9d0c9"} Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.404049 4749 scope.go:117] "RemoveContainer" containerID="48b754b29c3bc7275137b4cecd1546ee584a0ac513db8697081789d1e360dfe1" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.424307 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp"] Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.428357 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-665b455574-zsqdp"] Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.434560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed74dcac-b479-4a6c-9304-05eac99fe38b-config\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.434840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed74dcac-b479-4a6c-9304-05eac99fe38b-client-ca\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.435016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59fhg\" (UniqueName: \"kubernetes.io/projected/ed74dcac-b479-4a6c-9304-05eac99fe38b-kube-api-access-59fhg\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.435170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed74dcac-b479-4a6c-9304-05eac99fe38b-serving-cert\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.435799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed74dcac-b479-4a6c-9304-05eac99fe38b-client-ca\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.436119 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7566f66c88-9jjkw"] Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.436694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed74dcac-b479-4a6c-9304-05eac99fe38b-config\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.439311 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed74dcac-b479-4a6c-9304-05eac99fe38b-serving-cert\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.439963 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7566f66c88-9jjkw"] Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.453562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59fhg\" (UniqueName: \"kubernetes.io/projected/ed74dcac-b479-4a6c-9304-05eac99fe38b-kube-api-access-59fhg\") pod \"route-controller-manager-6669c9bf4-hhzcx\" (UID: \"ed74dcac-b479-4a6c-9304-05eac99fe38b\") " pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.527493 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.878860 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e97ae33-5cee-47b5-98dd-7d3979437654" path="/var/lib/kubelet/pods/2e97ae33-5cee-47b5-98dd-7d3979437654/volumes" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.879775 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8de8560-b540-4a79-a828-77ab59f06f95" path="/var/lib/kubelet/pods/a8de8560-b540-4a79-a828-77ab59f06f95/volumes" Jan 28 18:39:22 crc kubenswrapper[4749]: I0128 18:39:22.985759 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx"] Jan 28 18:39:22 crc kubenswrapper[4749]: W0128 18:39:22.994275 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded74dcac_b479_4a6c_9304_05eac99fe38b.slice/crio-8580301fd3f26373661dddfa2442dae3fe3a25cf3f8d3bb453caef1154679d23 WatchSource:0}: Error finding container 8580301fd3f26373661dddfa2442dae3fe3a25cf3f8d3bb453caef1154679d23: Status 404 returned error can't find the container with id 8580301fd3f26373661dddfa2442dae3fe3a25cf3f8d3bb453caef1154679d23 Jan 28 18:39:23 crc kubenswrapper[4749]: I0128 18:39:23.393586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" event={"ID":"ed74dcac-b479-4a6c-9304-05eac99fe38b","Type":"ContainerStarted","Data":"6d462b44be8af6346ef81d02d099db33effa0008f021e461bac05a94ec2fdf97"} Jan 28 18:39:23 crc kubenswrapper[4749]: I0128 18:39:23.393639 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:23 crc kubenswrapper[4749]: I0128 18:39:23.393650 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" 
event={"ID":"ed74dcac-b479-4a6c-9304-05eac99fe38b","Type":"ContainerStarted","Data":"8580301fd3f26373661dddfa2442dae3fe3a25cf3f8d3bb453caef1154679d23"} Jan 28 18:39:23 crc kubenswrapper[4749]: I0128 18:39:23.432257 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" podStartSLOduration=3.432236142 podStartE2EDuration="3.432236142s" podCreationTimestamp="2026-01-28 18:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:39:23.430364387 +0000 UTC m=+231.441891182" watchObservedRunningTime="2026-01-28 18:39:23.432236142 +0000 UTC m=+231.443762937" Jan 28 18:39:23 crc kubenswrapper[4749]: I0128 18:39:23.492558 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6669c9bf4-hhzcx" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.215042 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-847nd"] Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.215733 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.218879 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.218956 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.220628 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.220904 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.221286 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.221534 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.228223 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-847nd"] Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.228303 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.364102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-proxy-ca-bundles\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.364183 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-config\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.364233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46ebac68-c568-4362-9c8d-bc68e361e3a6-serving-cert\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.364336 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7wvn\" (UniqueName: \"kubernetes.io/projected/46ebac68-c568-4362-9c8d-bc68e361e3a6-kube-api-access-k7wvn\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.364458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-client-ca\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.466184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-proxy-ca-bundles\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.466260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-config\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.466344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46ebac68-c568-4362-9c8d-bc68e361e3a6-serving-cert\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.466457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7wvn\" (UniqueName: \"kubernetes.io/projected/46ebac68-c568-4362-9c8d-bc68e361e3a6-kube-api-access-k7wvn\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.466500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-client-ca\") pod \"controller-manager-79558b8d74-847nd\" (UID: 
\"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.467418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-proxy-ca-bundles\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.468010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-config\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.468896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46ebac68-c568-4362-9c8d-bc68e361e3a6-client-ca\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.479953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46ebac68-c568-4362-9c8d-bc68e361e3a6-serving-cert\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.487966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7wvn\" (UniqueName: \"kubernetes.io/projected/46ebac68-c568-4362-9c8d-bc68e361e3a6-kube-api-access-k7wvn\") pod \"controller-manager-79558b8d74-847nd\" (UID: \"46ebac68-c568-4362-9c8d-bc68e361e3a6\") " pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.532345 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:24 crc kubenswrapper[4749]: I0128 18:39:24.943251 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79558b8d74-847nd"] Jan 28 18:39:24 crc kubenswrapper[4749]: W0128 18:39:24.952167 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ebac68_c568_4362_9c8d_bc68e361e3a6.slice/crio-e0f0b6df167e8e9c472dbdd9c8b706512fe0c287a8c696a9cad851f7f0f2b2a6 WatchSource:0}: Error finding container e0f0b6df167e8e9c472dbdd9c8b706512fe0c287a8c696a9cad851f7f0f2b2a6: Status 404 returned error can't find the container with id e0f0b6df167e8e9c472dbdd9c8b706512fe0c287a8c696a9cad851f7f0f2b2a6 Jan 28 18:39:25 crc kubenswrapper[4749]: I0128 18:39:25.425066 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" event={"ID":"46ebac68-c568-4362-9c8d-bc68e361e3a6","Type":"ContainerStarted","Data":"130fd48519871f37bc769a31175f54cdf8547017d30b475148479e071e544f27"} Jan 28 18:39:25 crc kubenswrapper[4749]: I0128 18:39:25.425539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" event={"ID":"46ebac68-c568-4362-9c8d-bc68e361e3a6","Type":"ContainerStarted","Data":"e0f0b6df167e8e9c472dbdd9c8b706512fe0c287a8c696a9cad851f7f0f2b2a6"} Jan 28 18:39:25 crc kubenswrapper[4749]: I0128 18:39:25.443784 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" podStartSLOduration=5.443766156 podStartE2EDuration="5.443766156s" podCreationTimestamp="2026-01-28 18:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:39:25.442605568 +0000 UTC m=+233.454132353" watchObservedRunningTime="2026-01-28 18:39:25.443766156 +0000 UTC m=+233.455292931" Jan 28 18:39:26 crc kubenswrapper[4749]: I0128 18:39:26.429677 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:26 crc kubenswrapper[4749]: I0128 18:39:26.433689 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79558b8d74-847nd" Jan 28 18:39:38 crc kubenswrapper[4749]: I0128 18:39:38.815233 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ldb7k" Jan 28 18:39:38 crc kubenswrapper[4749]: I0128 18:39:38.891779 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5hd"] Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.119943 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cn95r"] Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.120163 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cn95r" podUID="01642193-d926-44c5-908a-98716476032b" containerName="registry-server" containerID="cri-o://bb8374f38945f151cdc5a4371456d1f24dc95f8ddee532dd2a1d88f16b432560" gracePeriod=30 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.129744 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-rhv8w"] Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.129980 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rhv8w" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerName="registry-server" containerID="cri-o://776f8e0882a32def36f1f68153ebfae7c3a09c93279505d95a0a53492c539a9c" gracePeriod=30 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.140043 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z74vl"] Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.140251 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" containerID="cri-o://8c06768f68a39cdc21d01da138ac9a991041ff068b507d6672f5d6e48354a206" gracePeriod=30 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.151381 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-whzp4"] Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.151634 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-whzp4" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="registry-server" containerID="cri-o://f1b91af5baceab106fd6e75209fa95e07c1de01ae7622838db2b1c30dbde2520" gracePeriod=30 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.160074 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jzj9j"] Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.160926 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.169543 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mk2w9"] Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.169761 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mk2w9" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="registry-server" containerID="cri-o://5aaae7b195ab94a2bcf3385a48673994535ef87fd799f749e01fb22a8ac0b0b6" gracePeriod=30 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.213492 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jzj9j"] Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.256346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/714ea987-0827-4494-9bee-eff5f2bb07b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.256418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx9t5\" (UniqueName: \"kubernetes.io/projected/714ea987-0827-4494-9bee-eff5f2bb07b2-kube-api-access-dx9t5\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.256757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/714ea987-0827-4494-9bee-eff5f2bb07b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.357839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/714ea987-0827-4494-9bee-eff5f2bb07b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.357885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/714ea987-0827-4494-9bee-eff5f2bb07b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.357909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx9t5\" (UniqueName: \"kubernetes.io/projected/714ea987-0827-4494-9bee-eff5f2bb07b2-kube-api-access-dx9t5\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.359443 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/714ea987-0827-4494-9bee-eff5f2bb07b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.365440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/714ea987-0827-4494-9bee-eff5f2bb07b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.372514 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx9t5\" (UniqueName: \"kubernetes.io/projected/714ea987-0827-4494-9bee-eff5f2bb07b2-kube-api-access-dx9t5\") pod \"marketplace-operator-79b997595-jzj9j\" (UID: \"714ea987-0827-4494-9bee-eff5f2bb07b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.483406 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.501123 4749 generic.go:334] "Generic (PLEG): container finished" podID="d0092231-157d-498a-a811-c533dccee8ce" containerID="8c06768f68a39cdc21d01da138ac9a991041ff068b507d6672f5d6e48354a206" exitCode=0 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.501196 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" event={"ID":"d0092231-157d-498a-a811-c533dccee8ce","Type":"ContainerDied","Data":"8c06768f68a39cdc21d01da138ac9a991041ff068b507d6672f5d6e48354a206"} Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.501239 4749 scope.go:117] "RemoveContainer" containerID="bb5be9913a3a37f304a7e73e831a5e4c4930b47ee3d0f71da8c7e5d162241abe" Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.503390 4749 generic.go:334] "Generic (PLEG): container finished" podID="5239a0f8-12de-4979-af6c-d209a21bc067" containerID="f1b91af5baceab106fd6e75209fa95e07c1de01ae7622838db2b1c30dbde2520" exitCode=0 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.503427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whzp4" event={"ID":"5239a0f8-12de-4979-af6c-d209a21bc067","Type":"ContainerDied","Data":"f1b91af5baceab106fd6e75209fa95e07c1de01ae7622838db2b1c30dbde2520"} Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.506396 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerID="776f8e0882a32def36f1f68153ebfae7c3a09c93279505d95a0a53492c539a9c" exitCode=0 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.506438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhv8w" event={"ID":"ffc6bd30-5803-4a01-a711-4f3b3c718750","Type":"ContainerDied","Data":"776f8e0882a32def36f1f68153ebfae7c3a09c93279505d95a0a53492c539a9c"} Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.508363 4749 generic.go:334] "Generic (PLEG): container finished" podID="01642193-d926-44c5-908a-98716476032b" 
containerID="bb8374f38945f151cdc5a4371456d1f24dc95f8ddee532dd2a1d88f16b432560" exitCode=0 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.508409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn95r" event={"ID":"01642193-d926-44c5-908a-98716476032b","Type":"ContainerDied","Data":"bb8374f38945f151cdc5a4371456d1f24dc95f8ddee532dd2a1d88f16b432560"} Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.509855 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerID="5aaae7b195ab94a2bcf3385a48673994535ef87fd799f749e01fb22a8ac0b0b6" exitCode=0 Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.509876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2w9" event={"ID":"ebc65c41-25f8-4ee9-9993-a00101a35397","Type":"ContainerDied","Data":"5aaae7b195ab94a2bcf3385a48673994535ef87fd799f749e01fb22a8ac0b0b6"} Jan 28 18:39:39 crc kubenswrapper[4749]: I0128 18:39:39.879398 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jzj9j"] Jan 28 18:39:39 crc kubenswrapper[4749]: W0128 18:39:39.891145 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod714ea987_0827_4494_9bee_eff5f2bb07b2.slice/crio-4c61291e23fd6f86144f17ac823c8e360375a28c04ba71247e975308e67265d7 WatchSource:0}: Error finding container 4c61291e23fd6f86144f17ac823c8e360375a28c04ba71247e975308e67265d7: Status 404 returned error can't find the container with id 4c61291e23fd6f86144f17ac823c8e360375a28c04ba71247e975308e67265d7 Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.450844 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.517225 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" event={"ID":"714ea987-0827-4494-9bee-eff5f2bb07b2","Type":"ContainerStarted","Data":"f67b0c3e5edb5d8e7413c1c5a3f838301e40b45f0f83815a24e4d1c96e0b9719"} Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.517275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" event={"ID":"714ea987-0827-4494-9bee-eff5f2bb07b2","Type":"ContainerStarted","Data":"4c61291e23fd6f86144f17ac823c8e360375a28c04ba71247e975308e67265d7"} Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.527571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhv8w" event={"ID":"ffc6bd30-5803-4a01-a711-4f3b3c718750","Type":"ContainerDied","Data":"6190ed0d15e035b8f99b7d42404e79e9db8be2be771d0fb977123110d969174b"} Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.527612 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6190ed0d15e035b8f99b7d42404e79e9db8be2be771d0fb977123110d969174b" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.529734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn95r" event={"ID":"01642193-d926-44c5-908a-98716476032b","Type":"ContainerDied","Data":"473270e3aa4e346f2ad4b6fd84b7d97b5864bba706628d79cf5e273f0ec0ac7b"} Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.529772 4749 scope.go:117] "RemoveContainer" containerID="bb8374f38945f151cdc5a4371456d1f24dc95f8ddee532dd2a1d88f16b432560" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.529879 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cn95r" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.542371 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.543670 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.556658 4749 scope.go:117] "RemoveContainer" containerID="ff94875b24f1eb34cb791a00e03c8e05e0f25c0e0fab94524514adf241bb16d0" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.573093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-utilities\") pod \"01642193-d926-44c5-908a-98716476032b\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.573422 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn59z\" (UniqueName: \"kubernetes.io/projected/01642193-d926-44c5-908a-98716476032b-kube-api-access-hn59z\") pod \"01642193-d926-44c5-908a-98716476032b\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.573488 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-catalog-content\") pod \"01642193-d926-44c5-908a-98716476032b\" (UID: \"01642193-d926-44c5-908a-98716476032b\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.574015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-utilities" (OuterVolumeSpecName: "utilities") pod "01642193-d926-44c5-908a-98716476032b" (UID: "01642193-d926-44c5-908a-98716476032b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.579833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01642193-d926-44c5-908a-98716476032b-kube-api-access-hn59z" (OuterVolumeSpecName: "kube-api-access-hn59z") pod "01642193-d926-44c5-908a-98716476032b" (UID: "01642193-d926-44c5-908a-98716476032b"). InnerVolumeSpecName "kube-api-access-hn59z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.581710 4749 scope.go:117] "RemoveContainer" containerID="f24936f5a1c0c6d4f9193af6249231de12f825866b02fde2b77557e4a585f37b" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.629627 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01642193-d926-44c5-908a-98716476032b" (UID: "01642193-d926-44c5-908a-98716476032b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.674694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsqlx\" (UniqueName: \"kubernetes.io/projected/5239a0f8-12de-4979-af6c-d209a21bc067-kube-api-access-vsqlx\") pod \"5239a0f8-12de-4979-af6c-d209a21bc067\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.675730 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2dl2\" (UniqueName: \"kubernetes.io/projected/ffc6bd30-5803-4a01-a711-4f3b3c718750-kube-api-access-z2dl2\") pod \"ffc6bd30-5803-4a01-a711-4f3b3c718750\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.676172 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-utilities\") pod \"ffc6bd30-5803-4a01-a711-4f3b3c718750\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.676455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-catalog-content\") pod \"ffc6bd30-5803-4a01-a711-4f3b3c718750\" (UID: \"ffc6bd30-5803-4a01-a711-4f3b3c718750\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.676694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-utilities\") pod \"5239a0f8-12de-4979-af6c-d209a21bc067\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.676817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-catalog-content\") pod \"5239a0f8-12de-4979-af6c-d209a21bc067\" (UID: \"5239a0f8-12de-4979-af6c-d209a21bc067\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.677632 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.677707 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01642193-d926-44c5-908a-98716476032b-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.677721 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn59z\" (UniqueName: \"kubernetes.io/projected/01642193-d926-44c5-908a-98716476032b-kube-api-access-hn59z\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.677744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-utilities" (OuterVolumeSpecName: "utilities") pod "ffc6bd30-5803-4a01-a711-4f3b3c718750" (UID: "ffc6bd30-5803-4a01-a711-4f3b3c718750"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.678530 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-utilities" (OuterVolumeSpecName: "utilities") pod "5239a0f8-12de-4979-af6c-d209a21bc067" (UID: "5239a0f8-12de-4979-af6c-d209a21bc067"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.692353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc6bd30-5803-4a01-a711-4f3b3c718750-kube-api-access-z2dl2" (OuterVolumeSpecName: "kube-api-access-z2dl2") pod "ffc6bd30-5803-4a01-a711-4f3b3c718750" (UID: "ffc6bd30-5803-4a01-a711-4f3b3c718750"). InnerVolumeSpecName "kube-api-access-z2dl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.696056 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5239a0f8-12de-4979-af6c-d209a21bc067-kube-api-access-vsqlx" (OuterVolumeSpecName: "kube-api-access-vsqlx") pod "5239a0f8-12de-4979-af6c-d209a21bc067" (UID: "5239a0f8-12de-4979-af6c-d209a21bc067"). InnerVolumeSpecName "kube-api-access-vsqlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.712762 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5239a0f8-12de-4979-af6c-d209a21bc067" (UID: "5239a0f8-12de-4979-af6c-d209a21bc067"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.739767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffc6bd30-5803-4a01-a711-4f3b3c718750" (UID: "ffc6bd30-5803-4a01-a711-4f3b3c718750"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.778798 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsqlx\" (UniqueName: \"kubernetes.io/projected/5239a0f8-12de-4979-af6c-d209a21bc067-kube-api-access-vsqlx\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.778852 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2dl2\" (UniqueName: \"kubernetes.io/projected/ffc6bd30-5803-4a01-a711-4f3b3c718750-kube-api-access-z2dl2\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.778865 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.778876 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc6bd30-5803-4a01-a711-4f3b3c718750-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.778906 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.778917 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5239a0f8-12de-4979-af6c-d209a21bc067-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.795743 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.866055 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cn95r"] Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.879176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0092231-157d-498a-a811-c533dccee8ce-marketplace-operator-metrics\") pod \"d0092231-157d-498a-a811-c533dccee8ce\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.879310 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0092231-157d-498a-a811-c533dccee8ce-marketplace-trusted-ca\") pod \"d0092231-157d-498a-a811-c533dccee8ce\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.879389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8mc6\" (UniqueName: \"kubernetes.io/projected/d0092231-157d-498a-a811-c533dccee8ce-kube-api-access-v8mc6\") pod \"d0092231-157d-498a-a811-c533dccee8ce\" (UID: \"d0092231-157d-498a-a811-c533dccee8ce\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.880035 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0092231-157d-498a-a811-c533dccee8ce-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d0092231-157d-498a-a811-c533dccee8ce" (UID: "d0092231-157d-498a-a811-c533dccee8ce"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.880716 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.881165 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cn95r"] Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.883722 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0092231-157d-498a-a811-c533dccee8ce-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d0092231-157d-498a-a811-c533dccee8ce" (UID: "d0092231-157d-498a-a811-c533dccee8ce"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.884214 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0092231-157d-498a-a811-c533dccee8ce-kube-api-access-v8mc6" (OuterVolumeSpecName: "kube-api-access-v8mc6") pod "d0092231-157d-498a-a811-c533dccee8ce" (UID: "d0092231-157d-498a-a811-c533dccee8ce"). InnerVolumeSpecName "kube-api-access-v8mc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.980199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-catalog-content\") pod \"ebc65c41-25f8-4ee9-9993-a00101a35397\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.980475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-utilities\") pod \"ebc65c41-25f8-4ee9-9993-a00101a35397\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.981292 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfktk\" (UniqueName: \"kubernetes.io/projected/ebc65c41-25f8-4ee9-9993-a00101a35397-kube-api-access-tfktk\") pod \"ebc65c41-25f8-4ee9-9993-a00101a35397\" (UID: \"ebc65c41-25f8-4ee9-9993-a00101a35397\") " Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.981209 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-utilities" (OuterVolumeSpecName: "utilities") pod "ebc65c41-25f8-4ee9-9993-a00101a35397" (UID: "ebc65c41-25f8-4ee9-9993-a00101a35397"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.982148 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.982258 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0092231-157d-498a-a811-c533dccee8ce-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.982370 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8mc6\" (UniqueName: \"kubernetes.io/projected/d0092231-157d-498a-a811-c533dccee8ce-kube-api-access-v8mc6\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.982443 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0092231-157d-498a-a811-c533dccee8ce-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:40 crc kubenswrapper[4749]: I0128 18:39:40.983576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebc65c41-25f8-4ee9-9993-a00101a35397-kube-api-access-tfktk" (OuterVolumeSpecName: "kube-api-access-tfktk") pod "ebc65c41-25f8-4ee9-9993-a00101a35397" (UID: "ebc65c41-25f8-4ee9-9993-a00101a35397"). InnerVolumeSpecName "kube-api-access-tfktk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.084137 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfktk\" (UniqueName: \"kubernetes.io/projected/ebc65c41-25f8-4ee9-9993-a00101a35397-kube-api-access-tfktk\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.101553 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebc65c41-25f8-4ee9-9993-a00101a35397" (UID: "ebc65c41-25f8-4ee9-9993-a00101a35397"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.186021 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebc65c41-25f8-4ee9-9993-a00101a35397-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.540183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2w9" event={"ID":"ebc65c41-25f8-4ee9-9993-a00101a35397","Type":"ContainerDied","Data":"cb96efea370f5b5e5b94e69d2e334c99fa82345b0b8f9bf202826805f0418416"} Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.540241 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mk2w9" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.540273 4749 scope.go:117] "RemoveContainer" containerID="5aaae7b195ab94a2bcf3385a48673994535ef87fd799f749e01fb22a8ac0b0b6" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.541749 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mq9lg"] Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.541979 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerName="extract-content" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.541996 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerName="extract-content" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542010 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542018 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542031 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="extract-content" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542039 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="extract-content" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542047 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01642193-d926-44c5-908a-98716476032b" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542055 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01642193-d926-44c5-908a-98716476032b" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542065 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="extract-utilities" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542074 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="extract-utilities" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542086 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="extract-content" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542095 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="extract-content" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542109 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01642193-d926-44c5-908a-98716476032b" containerName="extract-utilities" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542116 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01642193-d926-44c5-908a-98716476032b" containerName="extract-utilities" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542126 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542134 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542148 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="extract-utilities" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542156 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="extract-utilities" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542168 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542175 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542184 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerName="extract-utilities" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542193 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerName="extract-utilities" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542203 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01642193-d926-44c5-908a-98716476032b" containerName="extract-content" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542212 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01642193-d926-44c5-908a-98716476032b" containerName="extract-content" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542221 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542229 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542368 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542384 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542394 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01642193-d926-44c5-908a-98716476032b" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542408 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542416 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" containerName="registry-server" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.542425 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" Jan 28 18:39:41 crc kubenswrapper[4749]: E0128 18:39:41.542544 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" Jan 28 18:39:41 crc kubenswrapper[4749]: 
I0128 18:39:41.542555 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0092231-157d-498a-a811-c533dccee8ce" containerName="marketplace-operator" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.543369 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.545322 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whzp4" event={"ID":"5239a0f8-12de-4979-af6c-d209a21bc067","Type":"ContainerDied","Data":"32176ac25e3067a7ea471a8c548fa31d069a3cf8cdc0a6b54e129d70db92b5a1"} Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.545371 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whzp4" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.546867 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.554014 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.554054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z74vl" event={"ID":"d0092231-157d-498a-a811-c533dccee8ce","Type":"ContainerDied","Data":"f8ec94190709a5971dc7c17fbda6d64e08a4b6dd2d0828032572f49f24a52073"} Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.554152 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhv8w" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.554398 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.557081 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.557459 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mq9lg"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.572715 4749 scope.go:117] "RemoveContainer" containerID="b79ff7c2c69acb044f8bf97cfaa44ff922f20495b4d885ab05a85c8d54786f3c" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.588477 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-whzp4"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.593545 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-whzp4"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.601341 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jzj9j" podStartSLOduration=2.601302251 podStartE2EDuration="2.601302251s" podCreationTimestamp="2026-01-28 18:39:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:39:41.597072939 +0000 UTC m=+249.608599734" watchObservedRunningTime="2026-01-28 18:39:41.601302251 +0000 UTC m=+249.612829046" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.605656 4749 
scope.go:117] "RemoveContainer" containerID="1b6c37df7f9cd31fb18e60ed1e6daaef397ab7ecc7f319663f0e1a140e440455" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.619970 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhv8w"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.624011 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rhv8w"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.630309 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mk2w9"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.633157 4749 scope.go:117] "RemoveContainer" containerID="f1b91af5baceab106fd6e75209fa95e07c1de01ae7622838db2b1c30dbde2520" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.634036 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mk2w9"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.645687 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z74vl"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.653221 4749 scope.go:117] "RemoveContainer" containerID="4e140e7e41fbcd5b78bfd83fc4f3acf53a4239095e7ace2dc5db60d92c073291" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.654670 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z74vl"] Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.671095 4749 scope.go:117] "RemoveContainer" containerID="479b9e045041ee86eb5ea6ecd7cc41be3b19fa92143b3e2181fe434bb6d360a7" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.686134 4749 scope.go:117] "RemoveContainer" containerID="8c06768f68a39cdc21d01da138ac9a991041ff068b507d6672f5d6e48354a206" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.693939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxpqc\" (UniqueName: \"kubernetes.io/projected/8e5a0c35-40e6-424a-9a03-377de71895bb-kube-api-access-zxpqc\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.694062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-catalog-content\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.694430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-utilities\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.795812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxpqc\" (UniqueName: \"kubernetes.io/projected/8e5a0c35-40e6-424a-9a03-377de71895bb-kube-api-access-zxpqc\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 
crc kubenswrapper[4749]: I0128 18:39:41.795874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-catalog-content\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.795916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-utilities\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.796402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-utilities\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.796520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-catalog-content\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.811631 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxpqc\" (UniqueName: \"kubernetes.io/projected/8e5a0c35-40e6-424a-9a03-377de71895bb-kube-api-access-zxpqc\") pod \"certified-operators-mq9lg\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:41 crc kubenswrapper[4749]: I0128 18:39:41.876571 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:42 crc kubenswrapper[4749]: I0128 18:39:42.284048 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mq9lg"] Jan 28 18:39:42 crc kubenswrapper[4749]: I0128 18:39:42.559795 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9lg" event={"ID":"8e5a0c35-40e6-424a-9a03-377de71895bb","Type":"ContainerStarted","Data":"a4f9b43a08c7e16f68f6f435b0bb11976e2408c6897b123eb1145bd90e4b0735"} Jan 28 18:39:42 crc kubenswrapper[4749]: I0128 18:39:42.879997 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01642193-d926-44c5-908a-98716476032b" path="/var/lib/kubelet/pods/01642193-d926-44c5-908a-98716476032b/volumes" Jan 28 18:39:42 crc kubenswrapper[4749]: I0128 18:39:42.881201 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5239a0f8-12de-4979-af6c-d209a21bc067" path="/var/lib/kubelet/pods/5239a0f8-12de-4979-af6c-d209a21bc067/volumes" Jan 28 18:39:42 crc kubenswrapper[4749]: I0128 18:39:42.884007 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0092231-157d-498a-a811-c533dccee8ce" path="/var/lib/kubelet/pods/d0092231-157d-498a-a811-c533dccee8ce/volumes" Jan 28 18:39:42 crc kubenswrapper[4749]: I0128 18:39:42.885500 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebc65c41-25f8-4ee9-9993-a00101a35397" path="/var/lib/kubelet/pods/ebc65c41-25f8-4ee9-9993-a00101a35397/volumes" Jan 28 18:39:42 crc kubenswrapper[4749]: I0128 18:39:42.886454 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc6bd30-5803-4a01-a711-4f3b3c718750" path="/var/lib/kubelet/pods/ffc6bd30-5803-4a01-a711-4f3b3c718750/volumes" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.336221 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8b6q2"] Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.337414 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.340099 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.350447 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8b6q2"] Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.414856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e7736f-045a-424e-92f9-5ac197561ff0-utilities\") pod \"community-operators-8b6q2\" (UID: \"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.414923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hntwg\" (UniqueName: \"kubernetes.io/projected/12e7736f-045a-424e-92f9-5ac197561ff0-kube-api-access-hntwg\") pod \"community-operators-8b6q2\" (UID: \"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.414963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e7736f-045a-424e-92f9-5ac197561ff0-catalog-content\") pod \"community-operators-8b6q2\" (UID: \"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.516465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e7736f-045a-424e-92f9-5ac197561ff0-utilities\") pod \"community-operators-8b6q2\" (UID: \"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.516544 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hntwg\" (UniqueName: \"kubernetes.io/projected/12e7736f-045a-424e-92f9-5ac197561ff0-kube-api-access-hntwg\") pod \"community-operators-8b6q2\" (UID: \"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.516582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e7736f-045a-424e-92f9-5ac197561ff0-catalog-content\") pod \"community-operators-8b6q2\" (UID: \"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.517028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e7736f-045a-424e-92f9-5ac197561ff0-utilities\") pod \"community-operators-8b6q2\" (UID: \"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.517044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e7736f-045a-424e-92f9-5ac197561ff0-catalog-content\") pod \"community-operators-8b6q2\" (UID: 
\"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.537317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntwg\" (UniqueName: \"kubernetes.io/projected/12e7736f-045a-424e-92f9-5ac197561ff0-kube-api-access-hntwg\") pod \"community-operators-8b6q2\" (UID: \"12e7736f-045a-424e-92f9-5ac197561ff0\") " pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.569410 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerID="76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02" exitCode=0 Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.569514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9lg" event={"ID":"8e5a0c35-40e6-424a-9a03-377de71895bb","Type":"ContainerDied","Data":"76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02"} Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.655557 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.938229 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vhkfd"] Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.940209 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.942123 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 28 18:39:43 crc kubenswrapper[4749]: I0128 18:39:43.956504 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhkfd"] Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.023988 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpbzr\" (UniqueName: \"kubernetes.io/projected/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-kube-api-access-cpbzr\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.024036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-utilities\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.024077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-catalog-content\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.039101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8b6q2"] Jan 28 18:39:44 crc kubenswrapper[4749]: W0128 18:39:44.048646 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e7736f_045a_424e_92f9_5ac197561ff0.slice/crio-6d1fa7c77b111f3e3fa63ec32977233d3255ba278c79f4bd1223ea5f5838020f WatchSource:0}: Error finding container 6d1fa7c77b111f3e3fa63ec32977233d3255ba278c79f4bd1223ea5f5838020f: Status 404 returned error can't find the container with id 6d1fa7c77b111f3e3fa63ec32977233d3255ba278c79f4bd1223ea5f5838020f Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.125107 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpbzr\" (UniqueName: \"kubernetes.io/projected/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-kube-api-access-cpbzr\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.125267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-utilities\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.125355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-catalog-content\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.125667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-utilities\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.125693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-catalog-content\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.145453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpbzr\" (UniqueName: \"kubernetes.io/projected/b8bbe45a-affd-4b0f-81e1-dcf1db6b321a-kube-api-access-cpbzr\") pod \"redhat-operators-vhkfd\" (UID: \"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a\") " pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.270639 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.575282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8b6q2" event={"ID":"12e7736f-045a-424e-92f9-5ac197561ff0","Type":"ContainerStarted","Data":"b4a5f3fda86e8f5c36a4bfc4e62e5687cf5e310e10cb02c35a37a52153d7a76c"} Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.575326 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8b6q2" event={"ID":"12e7736f-045a-424e-92f9-5ac197561ff0","Type":"ContainerStarted","Data":"6d1fa7c77b111f3e3fa63ec32977233d3255ba278c79f4bd1223ea5f5838020f"} Jan 28 18:39:44 crc kubenswrapper[4749]: I0128 18:39:44.671978 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhkfd"] Jan 28 18:39:44 crc kubenswrapper[4749]: W0128 18:39:44.681636 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bbe45a_affd_4b0f_81e1_dcf1db6b321a.slice/crio-c057c754f8f5f79016b9034b4d936de30e63644e01a9c1916a6f5ce465ac2f2b WatchSource:0}: Error finding container c057c754f8f5f79016b9034b4d936de30e63644e01a9c1916a6f5ce465ac2f2b: Status 404 returned error can't find the container with id c057c754f8f5f79016b9034b4d936de30e63644e01a9c1916a6f5ce465ac2f2b Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.581363 4749 generic.go:334] "Generic (PLEG): container finished" podID="12e7736f-045a-424e-92f9-5ac197561ff0" containerID="b4a5f3fda86e8f5c36a4bfc4e62e5687cf5e310e10cb02c35a37a52153d7a76c" exitCode=0 Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.583246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8b6q2" event={"ID":"12e7736f-045a-424e-92f9-5ac197561ff0","Type":"ContainerDied","Data":"b4a5f3fda86e8f5c36a4bfc4e62e5687cf5e310e10cb02c35a37a52153d7a76c"} Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.584205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhkfd" event={"ID":"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a","Type":"ContainerStarted","Data":"347c4132b403bf3898fa1d0045ea450cb5b90a7c688b0a95d52a729345f34495"} Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.584289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhkfd" event={"ID":"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a","Type":"ContainerStarted","Data":"c057c754f8f5f79016b9034b4d936de30e63644e01a9c1916a6f5ce465ac2f2b"} Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.739835 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xlwht"] Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.741434 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.744906 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.751209 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlwht"] Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.858818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118103e4-87d2-4431-bd0a-17fddfbbc497-catalog-content\") pod \"redhat-marketplace-xlwht\" (UID: \"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.858890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlppd\" (UniqueName: \"kubernetes.io/projected/118103e4-87d2-4431-bd0a-17fddfbbc497-kube-api-access-vlppd\") pod \"redhat-marketplace-xlwht\" (UID: \"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.858985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118103e4-87d2-4431-bd0a-17fddfbbc497-utilities\") pod \"redhat-marketplace-xlwht\" (UID: \"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.960172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118103e4-87d2-4431-bd0a-17fddfbbc497-utilities\") pod \"redhat-marketplace-xlwht\" (UID: \"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.960255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118103e4-87d2-4431-bd0a-17fddfbbc497-catalog-content\") pod \"redhat-marketplace-xlwht\" (UID: \"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.960302 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlppd\" (UniqueName: \"kubernetes.io/projected/118103e4-87d2-4431-bd0a-17fddfbbc497-kube-api-access-vlppd\") pod \"redhat-marketplace-xlwht\" (UID: \"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.960640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118103e4-87d2-4431-bd0a-17fddfbbc497-utilities\") pod \"redhat-marketplace-xlwht\" (UID: \"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.960710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118103e4-87d2-4431-bd0a-17fddfbbc497-catalog-content\") pod \"redhat-marketplace-xlwht\" (UID: 
\"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:45 crc kubenswrapper[4749]: I0128 18:39:45.978567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlppd\" (UniqueName: \"kubernetes.io/projected/118103e4-87d2-4431-bd0a-17fddfbbc497-kube-api-access-vlppd\") pod \"redhat-marketplace-xlwht\" (UID: \"118103e4-87d2-4431-bd0a-17fddfbbc497\") " pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:46 crc kubenswrapper[4749]: I0128 18:39:46.124286 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:46 crc kubenswrapper[4749]: I0128 18:39:46.519073 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlwht"] Jan 28 18:39:46 crc kubenswrapper[4749]: I0128 18:39:46.588908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlwht" event={"ID":"118103e4-87d2-4431-bd0a-17fddfbbc497","Type":"ContainerStarted","Data":"2e60c776ea0a1d8c87783214ec908584ecf2aac13d016a1460058467fb42879e"} Jan 28 18:39:46 crc kubenswrapper[4749]: I0128 18:39:46.590531 4749 generic.go:334] "Generic (PLEG): container finished" podID="b8bbe45a-affd-4b0f-81e1-dcf1db6b321a" containerID="347c4132b403bf3898fa1d0045ea450cb5b90a7c688b0a95d52a729345f34495" exitCode=0 Jan 28 18:39:46 crc kubenswrapper[4749]: I0128 18:39:46.590604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhkfd" event={"ID":"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a","Type":"ContainerDied","Data":"347c4132b403bf3898fa1d0045ea450cb5b90a7c688b0a95d52a729345f34495"} Jan 28 18:39:48 crc kubenswrapper[4749]: I0128 18:39:48.621237 4749 generic.go:334] "Generic (PLEG): container finished" podID="118103e4-87d2-4431-bd0a-17fddfbbc497" containerID="2e6e481e47f2526f580606f7b2c149e1a326ee604ed0445b2ee5fb3600f36984" exitCode=0 Jan 28 18:39:48 crc kubenswrapper[4749]: I0128 18:39:48.621586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlwht" event={"ID":"118103e4-87d2-4431-bd0a-17fddfbbc497","Type":"ContainerDied","Data":"2e6e481e47f2526f580606f7b2c149e1a326ee604ed0445b2ee5fb3600f36984"} Jan 28 18:39:48 crc kubenswrapper[4749]: I0128 18:39:48.625470 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerID="f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827" exitCode=0 Jan 28 18:39:48 crc kubenswrapper[4749]: I0128 18:39:48.625517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9lg" event={"ID":"8e5a0c35-40e6-424a-9a03-377de71895bb","Type":"ContainerDied","Data":"f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827"} Jan 28 18:39:50 crc kubenswrapper[4749]: I0128 18:39:50.636393 4749 generic.go:334] "Generic (PLEG): container finished" podID="12e7736f-045a-424e-92f9-5ac197561ff0" containerID="6eb533a4080d8f0711fc9cf56a3583dc80d60b4ff3c81c889338f470b187502b" exitCode=0 Jan 28 18:39:50 crc kubenswrapper[4749]: I0128 18:39:50.636500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8b6q2" event={"ID":"12e7736f-045a-424e-92f9-5ac197561ff0","Type":"ContainerDied","Data":"6eb533a4080d8f0711fc9cf56a3583dc80d60b4ff3c81c889338f470b187502b"} Jan 28 18:39:50 crc kubenswrapper[4749]: I0128 18:39:50.638445 4749 
generic.go:334] "Generic (PLEG): container finished" podID="b8bbe45a-affd-4b0f-81e1-dcf1db6b321a" containerID="108c865fe76d226e1e74b943ec2bdddc64e6cc9633ab402d283d5e7772b51639" exitCode=0 Jan 28 18:39:50 crc kubenswrapper[4749]: I0128 18:39:50.638496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhkfd" event={"ID":"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a","Type":"ContainerDied","Data":"108c865fe76d226e1e74b943ec2bdddc64e6cc9633ab402d283d5e7772b51639"} Jan 28 18:39:51 crc kubenswrapper[4749]: I0128 18:39:51.645916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9lg" event={"ID":"8e5a0c35-40e6-424a-9a03-377de71895bb","Type":"ContainerStarted","Data":"8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b"} Jan 28 18:39:51 crc kubenswrapper[4749]: I0128 18:39:51.667524 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mq9lg" podStartSLOduration=3.351299557 podStartE2EDuration="10.667506004s" podCreationTimestamp="2026-01-28 18:39:41 +0000 UTC" firstStartedPulling="2026-01-28 18:39:43.572428305 +0000 UTC m=+251.583955080" lastFinishedPulling="2026-01-28 18:39:50.888634752 +0000 UTC m=+258.900161527" observedRunningTime="2026-01-28 18:39:51.666439399 +0000 UTC m=+259.677966194" watchObservedRunningTime="2026-01-28 18:39:51.667506004 +0000 UTC m=+259.679032779" Jan 28 18:39:51 crc kubenswrapper[4749]: I0128 18:39:51.877690 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:51 crc kubenswrapper[4749]: I0128 18:39:51.878063 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:39:52 crc kubenswrapper[4749]: I0128 18:39:52.916564 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mq9lg" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="registry-server" probeResult="failure" output=< Jan 28 18:39:52 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:39:52 crc kubenswrapper[4749]: > Jan 28 18:39:53 crc kubenswrapper[4749]: I0128 18:39:53.662942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhkfd" event={"ID":"b8bbe45a-affd-4b0f-81e1-dcf1db6b321a","Type":"ContainerStarted","Data":"577cb4dbf2408d170723df3ab3ec5f8ab9712dadba12fdb117f25ce84d41f989"} Jan 28 18:39:53 crc kubenswrapper[4749]: I0128 18:39:53.665033 4749 generic.go:334] "Generic (PLEG): container finished" podID="118103e4-87d2-4431-bd0a-17fddfbbc497" containerID="3c37fda97253cbffbd36b58e4c94aca380fdb9c202815447bcec2b1b2f579b43" exitCode=0 Jan 28 18:39:53 crc kubenswrapper[4749]: I0128 18:39:53.665854 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlwht" event={"ID":"118103e4-87d2-4431-bd0a-17fddfbbc497","Type":"ContainerDied","Data":"3c37fda97253cbffbd36b58e4c94aca380fdb9c202815447bcec2b1b2f579b43"} Jan 28 18:39:53 crc kubenswrapper[4749]: I0128 18:39:53.684402 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vhkfd" podStartSLOduration=4.809190516 podStartE2EDuration="10.684387487s" podCreationTimestamp="2026-01-28 18:39:43 +0000 UTC" firstStartedPulling="2026-01-28 18:39:46.607894966 +0000 UTC m=+254.619421741" 
lastFinishedPulling="2026-01-28 18:39:52.483091937 +0000 UTC m=+260.494618712" observedRunningTime="2026-01-28 18:39:53.678323201 +0000 UTC m=+261.689849996" watchObservedRunningTime="2026-01-28 18:39:53.684387487 +0000 UTC m=+261.695914252" Jan 28 18:39:54 crc kubenswrapper[4749]: I0128 18:39:54.271223 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:54 crc kubenswrapper[4749]: I0128 18:39:54.271512 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:39:55 crc kubenswrapper[4749]: I0128 18:39:55.309758 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vhkfd" podUID="b8bbe45a-affd-4b0f-81e1-dcf1db6b321a" containerName="registry-server" probeResult="failure" output=< Jan 28 18:39:55 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:39:55 crc kubenswrapper[4749]: > Jan 28 18:39:55 crc kubenswrapper[4749]: I0128 18:39:55.695313 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8b6q2" event={"ID":"12e7736f-045a-424e-92f9-5ac197561ff0","Type":"ContainerStarted","Data":"8dd9a79f10a3f3eec6b49f888a38d0c2c26c5769b25c33f9cc59f060563f96fb"} Jan 28 18:39:55 crc kubenswrapper[4749]: I0128 18:39:55.698685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlwht" event={"ID":"118103e4-87d2-4431-bd0a-17fddfbbc497","Type":"ContainerStarted","Data":"ea41642c6256db2e2f7e19d3855cdae1df2eb7432f32d6e322b0e611004ad01d"} Jan 28 18:39:55 crc kubenswrapper[4749]: I0128 18:39:55.716196 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8b6q2" podStartSLOduration=4.56889213 podStartE2EDuration="12.71617754s" podCreationTimestamp="2026-01-28 18:39:43 +0000 UTC" firstStartedPulling="2026-01-28 18:39:45.583029617 +0000 UTC m=+253.594556392" lastFinishedPulling="2026-01-28 18:39:53.730315037 +0000 UTC m=+261.741841802" observedRunningTime="2026-01-28 18:39:55.712297286 +0000 UTC m=+263.723824091" watchObservedRunningTime="2026-01-28 18:39:55.71617754 +0000 UTC m=+263.727704315" Jan 28 18:39:55 crc kubenswrapper[4749]: I0128 18:39:55.735742 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xlwht" podStartSLOduration=5.058941985 podStartE2EDuration="10.735725133s" podCreationTimestamp="2026-01-28 18:39:45 +0000 UTC" firstStartedPulling="2026-01-28 18:39:48.749321775 +0000 UTC m=+256.760848580" lastFinishedPulling="2026-01-28 18:39:54.426104953 +0000 UTC m=+262.437631728" observedRunningTime="2026-01-28 18:39:55.73067779 +0000 UTC m=+263.742204575" watchObservedRunningTime="2026-01-28 18:39:55.735725133 +0000 UTC m=+263.747251908" Jan 28 18:39:56 crc kubenswrapper[4749]: I0128 18:39:56.125535 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:56 crc kubenswrapper[4749]: I0128 18:39:56.125585 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:39:57 crc kubenswrapper[4749]: I0128 18:39:57.185005 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xlwht" podUID="118103e4-87d2-4431-bd0a-17fddfbbc497" 
containerName="registry-server" probeResult="failure" output=< Jan 28 18:39:57 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:39:57 crc kubenswrapper[4749]: > Jan 28 18:40:01 crc kubenswrapper[4749]: I0128 18:40:01.949818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:40:01 crc kubenswrapper[4749]: I0128 18:40:01.992241 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 18:40:03 crc kubenswrapper[4749]: I0128 18:40:03.656501 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:40:03 crc kubenswrapper[4749]: I0128 18:40:03.656579 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:40:03 crc kubenswrapper[4749]: I0128 18:40:03.697496 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:40:03 crc kubenswrapper[4749]: I0128 18:40:03.773489 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8b6q2" Jan 28 18:40:03 crc kubenswrapper[4749]: I0128 18:40:03.929171 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" podUID="a45db790-79fe-4928-b701-d64737024f60" containerName="registry" containerID="cri-o://4187963be65e0224661c02b7690c5bcb483dd4bdc6ffc084d2c7850ed17d534b" gracePeriod=30 Jan 28 18:40:04 crc kubenswrapper[4749]: I0128 18:40:04.316374 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:40:04 crc kubenswrapper[4749]: I0128 18:40:04.367307 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vhkfd" Jan 28 18:40:06 crc kubenswrapper[4749]: I0128 18:40:06.189191 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:40:06 crc kubenswrapper[4749]: I0128 18:40:06.236515 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xlwht" Jan 28 18:40:07 crc kubenswrapper[4749]: I0128 18:40:07.763928 4749 generic.go:334] "Generic (PLEG): container finished" podID="a45db790-79fe-4928-b701-d64737024f60" containerID="4187963be65e0224661c02b7690c5bcb483dd4bdc6ffc084d2c7850ed17d534b" exitCode=0 Jan 28 18:40:07 crc kubenswrapper[4749]: I0128 18:40:07.763987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" event={"ID":"a45db790-79fe-4928-b701-d64737024f60","Type":"ContainerDied","Data":"4187963be65e0224661c02b7690c5bcb483dd4bdc6ffc084d2c7850ed17d534b"} Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.665178 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.777413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" event={"ID":"a45db790-79fe-4928-b701-d64737024f60","Type":"ContainerDied","Data":"ccd294db1970901ef31362870de5455fcdf090ae9e6410a192c924efc2a34dea"} Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.777476 4749 scope.go:117] "RemoveContainer" containerID="4187963be65e0224661c02b7690c5bcb483dd4bdc6ffc084d2c7850ed17d534b" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.777490 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bt5hd" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.787548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-registry-certificates\") pod \"a45db790-79fe-4928-b701-d64737024f60\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.787745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a45db790-79fe-4928-b701-d64737024f60-ca-trust-extracted\") pod \"a45db790-79fe-4928-b701-d64737024f60\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.787816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a45db790-79fe-4928-b701-d64737024f60-installation-pull-secrets\") pod \"a45db790-79fe-4928-b701-d64737024f60\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.787930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-registry-tls\") pod \"a45db790-79fe-4928-b701-d64737024f60\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.788183 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a45db790-79fe-4928-b701-d64737024f60\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.788227 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sgwr\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-kube-api-access-2sgwr\") pod \"a45db790-79fe-4928-b701-d64737024f60\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.788253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-bound-sa-token\") pod \"a45db790-79fe-4928-b701-d64737024f60\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.788303 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-trusted-ca\") pod \"a45db790-79fe-4928-b701-d64737024f60\" (UID: \"a45db790-79fe-4928-b701-d64737024f60\") " Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.788309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a45db790-79fe-4928-b701-d64737024f60" (UID: "a45db790-79fe-4928-b701-d64737024f60"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.789040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a45db790-79fe-4928-b701-d64737024f60" (UID: "a45db790-79fe-4928-b701-d64737024f60"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.789833 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.789867 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a45db790-79fe-4928-b701-d64737024f60-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.802392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a45db790-79fe-4928-b701-d64737024f60" (UID: "a45db790-79fe-4928-b701-d64737024f60"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.802648 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a45db790-79fe-4928-b701-d64737024f60" (UID: "a45db790-79fe-4928-b701-d64737024f60"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.803080 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45db790-79fe-4928-b701-d64737024f60-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a45db790-79fe-4928-b701-d64737024f60" (UID: "a45db790-79fe-4928-b701-d64737024f60"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.804636 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-kube-api-access-2sgwr" (OuterVolumeSpecName: "kube-api-access-2sgwr") pod "a45db790-79fe-4928-b701-d64737024f60" (UID: "a45db790-79fe-4928-b701-d64737024f60"). InnerVolumeSpecName "kube-api-access-2sgwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.820269 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45db790-79fe-4928-b701-d64737024f60-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a45db790-79fe-4928-b701-d64737024f60" (UID: "a45db790-79fe-4928-b701-d64737024f60"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.821930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a45db790-79fe-4928-b701-d64737024f60" (UID: "a45db790-79fe-4928-b701-d64737024f60"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.890606 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.890634 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sgwr\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-kube-api-access-2sgwr\") on node \"crc\" DevicePath \"\"" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.890644 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a45db790-79fe-4928-b701-d64737024f60-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.890653 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a45db790-79fe-4928-b701-d64737024f60-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 28 18:40:08 crc kubenswrapper[4749]: I0128 18:40:08.890667 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a45db790-79fe-4928-b701-d64737024f60-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 28 18:40:09 crc kubenswrapper[4749]: I0128 18:40:09.140085 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5hd"] Jan 28 18:40:09 crc kubenswrapper[4749]: I0128 18:40:09.143793 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5hd"] Jan 28 18:40:10 crc kubenswrapper[4749]: I0128 18:40:10.879581 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45db790-79fe-4928-b701-d64737024f60" path="/var/lib/kubelet/pods/a45db790-79fe-4928-b701-d64737024f60/volumes" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.742873 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr"] Jan 28 18:40:12 crc kubenswrapper[4749]: E0128 18:40:12.743148 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45db790-79fe-4928-b701-d64737024f60" containerName="registry" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.743164 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45db790-79fe-4928-b701-d64737024f60" containerName="registry" Jan 
28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.743344 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45db790-79fe-4928-b701-d64737024f60" containerName="registry" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.743843 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.745919 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.746104 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.746409 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.747061 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.747800 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.759341 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr"] Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.843199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79r26\" (UniqueName: \"kubernetes.io/projected/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-kube-api-access-79r26\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.843255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.843395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.944617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79r26\" (UniqueName: \"kubernetes.io/projected/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-kube-api-access-79r26\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.944678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.944765 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.946248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.952472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:12 crc kubenswrapper[4749]: I0128 18:40:12.961790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79r26\" (UniqueName: \"kubernetes.io/projected/ae11ef65-a99e-43e6-a9e1-cbc132f031e3-kube-api-access-79r26\") pod \"cluster-monitoring-operator-6d5b84845-5mjzr\" (UID: \"ae11ef65-a99e-43e6-a9e1-cbc132f031e3\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:13 crc kubenswrapper[4749]: I0128 18:40:13.060348 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" Jan 28 18:40:13 crc kubenswrapper[4749]: I0128 18:40:13.444632 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr"] Jan 28 18:40:13 crc kubenswrapper[4749]: W0128 18:40:13.452508 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae11ef65_a99e_43e6_a9e1_cbc132f031e3.slice/crio-7c34c3dc01fa7f69af043378cb6d78e2021149136c9c42f5bf176496cb223a07 WatchSource:0}: Error finding container 7c34c3dc01fa7f69af043378cb6d78e2021149136c9c42f5bf176496cb223a07: Status 404 returned error can't find the container with id 7c34c3dc01fa7f69af043378cb6d78e2021149136c9c42f5bf176496cb223a07 Jan 28 18:40:13 crc kubenswrapper[4749]: I0128 18:40:13.805470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" event={"ID":"ae11ef65-a99e-43e6-a9e1-cbc132f031e3","Type":"ContainerStarted","Data":"7c34c3dc01fa7f69af043378cb6d78e2021149136c9c42f5bf176496cb223a07"} Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.416667 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj"] Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.418393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.424460 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.424985 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-k9gts" Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.431253 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj"] Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.517942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/53f97abf-7aa8-4cb1-8ffd-2af56a683fcd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-krpfj\" (UID: \"53f97abf-7aa8-4cb1-8ffd-2af56a683fcd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.619288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/53f97abf-7aa8-4cb1-8ffd-2af56a683fcd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-krpfj\" (UID: \"53f97abf-7aa8-4cb1-8ffd-2af56a683fcd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.631383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/53f97abf-7aa8-4cb1-8ffd-2af56a683fcd-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-krpfj\" (UID: \"53f97abf-7aa8-4cb1-8ffd-2af56a683fcd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" 
Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.739105 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" Jan 28 18:40:18 crc kubenswrapper[4749]: I0128 18:40:18.839597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" event={"ID":"ae11ef65-a99e-43e6-a9e1-cbc132f031e3","Type":"ContainerStarted","Data":"9b0668f064537594fd5ec40f25c072120b1c0839d2bf4606237601b81941a17d"} Jan 28 18:40:21 crc kubenswrapper[4749]: I0128 18:40:21.398637 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-5mjzr" podStartSLOduration=5.338418099 podStartE2EDuration="9.398616449s" podCreationTimestamp="2026-01-28 18:40:12 +0000 UTC" firstStartedPulling="2026-01-28 18:40:13.454184654 +0000 UTC m=+281.465711439" lastFinishedPulling="2026-01-28 18:40:17.514383004 +0000 UTC m=+285.525909789" observedRunningTime="2026-01-28 18:40:18.859763455 +0000 UTC m=+286.871290260" watchObservedRunningTime="2026-01-28 18:40:21.398616449 +0000 UTC m=+289.410143234" Jan 28 18:40:21 crc kubenswrapper[4749]: I0128 18:40:21.399493 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj"] Jan 28 18:40:21 crc kubenswrapper[4749]: I0128 18:40:21.854524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" event={"ID":"53f97abf-7aa8-4cb1-8ffd-2af56a683fcd","Type":"ContainerStarted","Data":"dcb7592ca6237a7011514fc56a0292e393c68b807b91bdcf269756ac80cfff81"} Jan 28 18:40:23 crc kubenswrapper[4749]: I0128 18:40:23.867473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" event={"ID":"53f97abf-7aa8-4cb1-8ffd-2af56a683fcd","Type":"ContainerStarted","Data":"36ce7774abdc9e6bf56d859a77ae556d7f1b3ef58f068fdb6bfe1faa63fc447b"} Jan 28 18:40:23 crc kubenswrapper[4749]: I0128 18:40:23.868449 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" Jan 28 18:40:23 crc kubenswrapper[4749]: I0128 18:40:23.870531 4749 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-krpfj container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.76:8443/healthz\": dial tcp 10.217.0.76:8443: connect: connection refused" start-of-body= Jan 28 18:40:23 crc kubenswrapper[4749]: I0128 18:40:23.870585 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" podUID="53f97abf-7aa8-4cb1-8ffd-2af56a683fcd" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.76:8443/healthz\": dial tcp 10.217.0.76:8443: connect: connection refused" Jan 28 18:40:23 crc kubenswrapper[4749]: I0128 18:40:23.883237 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" podStartSLOduration=3.574895249 podStartE2EDuration="5.883218692s" podCreationTimestamp="2026-01-28 18:40:18 +0000 UTC" firstStartedPulling="2026-01-28 18:40:21.409589394 +0000 UTC 
m=+289.421116169" lastFinishedPulling="2026-01-28 18:40:23.717912837 +0000 UTC m=+291.729439612" observedRunningTime="2026-01-28 18:40:23.882852763 +0000 UTC m=+291.894379548" watchObservedRunningTime="2026-01-28 18:40:23.883218692 +0000 UTC m=+291.894745477" Jan 28 18:40:24 crc kubenswrapper[4749]: I0128 18:40:24.879508 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-krpfj" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.460838 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-m52fk"] Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.461812 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.463864 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.463881 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.464249 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-jhmn2" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.465208 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.472701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-m52fk"] Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.614361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.614446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89b92063-36b4-4def-849f-7fdc92e04308-metrics-client-ca\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.614474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.614500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lmz2\" (UniqueName: \"kubernetes.io/projected/89b92063-36b4-4def-849f-7fdc92e04308-kube-api-access-4lmz2\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " 
pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.715540 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89b92063-36b4-4def-849f-7fdc92e04308-metrics-client-ca\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.715604 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.715642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lmz2\" (UniqueName: \"kubernetes.io/projected/89b92063-36b4-4def-849f-7fdc92e04308-kube-api-access-4lmz2\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.715701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: E0128 18:40:25.715848 4749 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Jan 28 18:40:25 crc kubenswrapper[4749]: E0128 18:40:25.715911 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-tls podName:89b92063-36b4-4def-849f-7fdc92e04308 nodeName:}" failed. No retries permitted until 2026-01-28 18:40:26.215894681 +0000 UTC m=+294.227421456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-tls") pod "prometheus-operator-db54df47d-m52fk" (UID: "89b92063-36b4-4def-849f-7fdc92e04308") : secret "prometheus-operator-tls" not found Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.716934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89b92063-36b4-4def-849f-7fdc92e04308-metrics-client-ca\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.721901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:25 crc kubenswrapper[4749]: I0128 18:40:25.732889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lmz2\" (UniqueName: \"kubernetes.io/projected/89b92063-36b4-4def-849f-7fdc92e04308-kube-api-access-4lmz2\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:26 crc kubenswrapper[4749]: I0128 18:40:26.222582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:26 crc kubenswrapper[4749]: I0128 18:40:26.227962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89b92063-36b4-4def-849f-7fdc92e04308-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-m52fk\" (UID: \"89b92063-36b4-4def-849f-7fdc92e04308\") " pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:26 crc kubenswrapper[4749]: I0128 18:40:26.389047 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" Jan 28 18:40:26 crc kubenswrapper[4749]: I0128 18:40:26.768226 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-m52fk"] Jan 28 18:40:26 crc kubenswrapper[4749]: I0128 18:40:26.883163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" event={"ID":"89b92063-36b4-4def-849f-7fdc92e04308","Type":"ContainerStarted","Data":"0a5e7f9b19aa907db0f8dc0333590758839bd1614d2926b4bf98d010080a68c3"} Jan 28 18:40:28 crc kubenswrapper[4749]: I0128 18:40:28.895530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" event={"ID":"89b92063-36b4-4def-849f-7fdc92e04308","Type":"ContainerStarted","Data":"19c3911dd52e206a182215b6835faacda50ae4e992d939323d41c691618891d1"} Jan 28 18:40:28 crc kubenswrapper[4749]: I0128 18:40:28.895915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" event={"ID":"89b92063-36b4-4def-849f-7fdc92e04308","Type":"ContainerStarted","Data":"de467f2bb2d2d649e77c8065eece67d43e01d23ac38f05e2024620cf418cb512"} Jan 28 18:40:28 crc kubenswrapper[4749]: I0128 18:40:28.923522 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-m52fk" podStartSLOduration=2.122373861 podStartE2EDuration="3.923500457s" podCreationTimestamp="2026-01-28 18:40:25 +0000 UTC" firstStartedPulling="2026-01-28 18:40:26.778422809 +0000 UTC m=+294.789949584" lastFinishedPulling="2026-01-28 18:40:28.579549405 +0000 UTC m=+296.591076180" observedRunningTime="2026-01-28 18:40:28.919213234 +0000 UTC m=+296.930740009" watchObservedRunningTime="2026-01-28 18:40:28.923500457 +0000 UTC m=+296.935027232" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.787495 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw"] Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.789280 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.792004 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-xrpw9" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.792638 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.792697 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.817636 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw"] Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.830272 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rs2xm"] Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.831581 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.837102 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-9q5fd" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.837112 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.841375 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.880700 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp"] Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.881858 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.883168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ec2f443-2180-4420-85ac-a10185e096df-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.883217 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x5xh\" (UniqueName: \"kubernetes.io/projected/1ec2f443-2180-4420-85ac-a10185e096df-kube-api-access-8x5xh\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.883248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.883274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.884934 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.885775 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.885833 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-lwg8m" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.886716 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.891041 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp"] Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.984790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-sys\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.984851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.984879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s5mm\" (UniqueName: \"kubernetes.io/projected/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-kube-api-access-2s5mm\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.984909 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-root\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.984930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.984966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.984996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-metrics-client-ca\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfdrq\" (UniqueName: \"kubernetes.io/projected/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-api-access-cfdrq\") pod 
\"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985070 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-wtmp\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985098 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ec2f443-2180-4420-85ac-a10185e096df-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985140 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x5xh\" (UniqueName: \"kubernetes.io/projected/1ec2f443-2180-4420-85ac-a10185e096df-kube-api-access-8x5xh\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-tls\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: 
\"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.985213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-textfile\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:30 crc kubenswrapper[4749]: I0128 18:40:30.986431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ec2f443-2180-4420-85ac-a10185e096df-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:30 crc kubenswrapper[4749]: E0128 18:40:30.986766 4749 secret.go:188] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Jan 28 18:40:30 crc kubenswrapper[4749]: E0128 18:40:30.986809 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-tls podName:1ec2f443-2180-4420-85ac-a10185e096df nodeName:}" failed. No retries permitted until 2026-01-28 18:40:31.486797239 +0000 UTC m=+299.498324014 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-tls") pod "openshift-state-metrics-566fddb674-gwsbw" (UID: "1ec2f443-2180-4420-85ac-a10185e096df") : secret "openshift-state-metrics-tls" not found Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.000752 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.002666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x5xh\" (UniqueName: \"kubernetes.io/projected/1ec2f443-2180-4420-85ac-a10185e096df-kube-api-access-8x5xh\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-sys\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s5mm\" (UniqueName: \"kubernetes.io/projected/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-kube-api-access-2s5mm\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-root\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" 
(UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfdrq\" (UniqueName: \"kubernetes.io/projected/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-api-access-cfdrq\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-metrics-client-ca\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086741 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-wtmp\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086865 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-tls\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.086908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-textfile\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.087287 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-sys\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.087532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-root\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: E0128 18:40:31.087785 4749 secret.go:188] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Jan 28 18:40:31 crc kubenswrapper[4749]: E0128 18:40:31.087854 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-tls podName:c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb nodeName:}" failed. No retries permitted until 2026-01-28 18:40:31.587834382 +0000 UTC m=+299.599361167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-tls") pod "node-exporter-rs2xm" (UID: "c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb") : secret "node-exporter-tls" not found Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.087907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.087950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-textfile\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.088171 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-wtmp\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.088415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.088588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-metrics-client-ca\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.088906 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.093729 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.097798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.107138 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.112492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfdrq\" (UniqueName: \"kubernetes.io/projected/a2580e9e-f086-4e94-ac9a-c74ffb5ebb83-kube-api-access-cfdrq\") pod \"kube-state-metrics-777cb5bd5d-mlncp\" (UID: \"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.123935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s5mm\" (UniqueName: \"kubernetes.io/projected/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-kube-api-access-2s5mm\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.199866 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.492246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.496950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ec2f443-2180-4420-85ac-a10185e096df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-gwsbw\" (UID: \"1ec2f443-2180-4420-85ac-a10185e096df\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.590688 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp"] Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.594047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-tls\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.598746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb-node-exporter-tls\") pod \"node-exporter-rs2xm\" (UID: \"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb\") " pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.707474 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.746380 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rs2xm" Jan 28 18:40:31 crc kubenswrapper[4749]: W0128 18:40:31.792867 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc416a6c9_fe5a_49ed_b0fb_b9ff0aae89eb.slice/crio-90a061e44c161f37343237752280f6c72100db01de707ab58bce8a6862998ce6 WatchSource:0}: Error finding container 90a061e44c161f37343237752280f6c72100db01de707ab58bce8a6862998ce6: Status 404 returned error can't find the container with id 90a061e44c161f37343237752280f6c72100db01de707ab58bce8a6862998ce6 Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.927425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rs2xm" event={"ID":"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb","Type":"ContainerStarted","Data":"90a061e44c161f37343237752280f6c72100db01de707ab58bce8a6862998ce6"} Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.929486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" event={"ID":"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83","Type":"ContainerStarted","Data":"e92873251ce32a6a4613c6d0178ebf23c97e68b7435e33d67a84e71a21f2ca95"} Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.956718 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.959141 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.964061 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.964074 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.964065 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.964458 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-fn7qc" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.964546 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.964633 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.964720 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.965040 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.974828 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Jan 28 18:40:31 crc kubenswrapper[4749]: I0128 18:40:31.988569 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.104677 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105181 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bsg8\" (UniqueName: \"kubernetes.io/projected/ccf7372f-949d-493e-b5ea-f005059fc7c9-kube-api-access-8bsg8\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ccf7372f-949d-493e-b5ea-f005059fc7c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ccf7372f-949d-493e-b5ea-f005059fc7c9-config-out\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ccf7372f-949d-493e-b5ea-f005059fc7c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccf7372f-949d-493e-b5ea-f005059fc7c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105445 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-web-config\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.105473 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ccf7372f-949d-493e-b5ea-f005059fc7c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.203001 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw"] Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.206764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.206833 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.206862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bsg8\" (UniqueName: \"kubernetes.io/projected/ccf7372f-949d-493e-b5ea-f005059fc7c9-kube-api-access-8bsg8\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.206906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.206929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ccf7372f-949d-493e-b5ea-f005059fc7c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " 
pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.206967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ccf7372f-949d-493e-b5ea-f005059fc7c9-config-out\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.206988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.207012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccf7372f-949d-493e-b5ea-f005059fc7c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.207036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccf7372f-949d-493e-b5ea-f005059fc7c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.207061 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-web-config\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.207092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ccf7372f-949d-493e-b5ea-f005059fc7c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.207113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.208394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ccf7372f-949d-493e-b5ea-f005059fc7c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.208765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccf7372f-949d-493e-b5ea-f005059fc7c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.209039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccf7372f-949d-493e-b5ea-f005059fc7c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.212843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.213344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.213464 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.215857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ccf7372f-949d-493e-b5ea-f005059fc7c9-config-out\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.216238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ccf7372f-949d-493e-b5ea-f005059fc7c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.216461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.219078 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.223894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ccf7372f-949d-493e-b5ea-f005059fc7c9-web-config\") pod \"alertmanager-main-0\" (UID: 
\"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.228840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bsg8\" (UniqueName: \"kubernetes.io/projected/ccf7372f-949d-493e-b5ea-f005059fc7c9-kube-api-access-8bsg8\") pod \"alertmanager-main-0\" (UID: \"ccf7372f-949d-493e-b5ea-f005059fc7c9\") " pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.292055 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.692686 4749 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.767377 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.901164 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-757c76cf74-pxbgx"] Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.906022 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.915295 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.915389 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.915678 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-dvl5vt1to69e6" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.916509 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.917010 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.918065 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.918806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-7x7fp" Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.936673 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-757c76cf74-pxbgx"] Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.959640 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" event={"ID":"1ec2f443-2180-4420-85ac-a10185e096df","Type":"ContainerStarted","Data":"25020b0972c79596ca9aea440963c03e631a8258defecd55d7b7953058347bcb"} Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.960028 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" 
event={"ID":"1ec2f443-2180-4420-85ac-a10185e096df","Type":"ContainerStarted","Data":"f1f6d146b25aa20989c0c1f8d0409225282bce568c1b666f85bc3b9b19e65473"} Jan 28 18:40:32 crc kubenswrapper[4749]: I0128 18:40:32.960040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" event={"ID":"1ec2f443-2180-4420-85ac-a10185e096df","Type":"ContainerStarted","Data":"867c9107f8aee676863c81bb957e323ef587157a09264589a1b5616f75f4c0ea"} Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.026486 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.026535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.026558 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.026587 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98844b8b-96b7-4d57-b73b-980cea001f3a-metrics-client-ca\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.026604 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.026641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-grpc-tls\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.026664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6z7t\" (UniqueName: \"kubernetes.io/projected/98844b8b-96b7-4d57-b73b-980cea001f3a-kube-api-access-j6z7t\") pod 
\"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.026712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-tls\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.128348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98844b8b-96b7-4d57-b73b-980cea001f3a-metrics-client-ca\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.128406 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.128437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-grpc-tls\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.128463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6z7t\" (UniqueName: \"kubernetes.io/projected/98844b8b-96b7-4d57-b73b-980cea001f3a-kube-api-access-j6z7t\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.128527 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-tls\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.128561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.128586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: 
\"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.128608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.129799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98844b8b-96b7-4d57-b73b-980cea001f3a-metrics-client-ca\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.136017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-tls\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.136088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.136128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-grpc-tls\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.136209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.137980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.143861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/98844b8b-96b7-4d57-b73b-980cea001f3a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " 
pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.146245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6z7t\" (UniqueName: \"kubernetes.io/projected/98844b8b-96b7-4d57-b73b-980cea001f3a-kube-api-access-j6z7t\") pod \"thanos-querier-757c76cf74-pxbgx\" (UID: \"98844b8b-96b7-4d57-b73b-980cea001f3a\") " pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.238298 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:33 crc kubenswrapper[4749]: W0128 18:40:33.313255 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccf7372f_949d_493e_b5ea_f005059fc7c9.slice/crio-144fadd93f9f14436fa300c64c430683f03e26d2693c2621e7c249df070d38e7 WatchSource:0}: Error finding container 144fadd93f9f14436fa300c64c430683f03e26d2693c2621e7c249df070d38e7: Status 404 returned error can't find the container with id 144fadd93f9f14436fa300c64c430683f03e26d2693c2621e7c249df070d38e7 Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.972280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ccf7372f-949d-493e-b5ea-f005059fc7c9","Type":"ContainerStarted","Data":"144fadd93f9f14436fa300c64c430683f03e26d2693c2621e7c249df070d38e7"} Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.981505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" event={"ID":"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83","Type":"ContainerStarted","Data":"1e8ce332556a40fb6b8ecb26a1d76a689368a86b319368c1775ae0138eb4afba"} Jan 28 18:40:33 crc kubenswrapper[4749]: I0128 18:40:33.985656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rs2xm" event={"ID":"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb","Type":"ContainerStarted","Data":"378d8c3c3ed9997a3b232551bab40d52a5a6bd17cec3e8515ec46f601bf6d88c"} Jan 28 18:40:34 crc kubenswrapper[4749]: I0128 18:40:34.179972 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-757c76cf74-pxbgx"] Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.002189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" event={"ID":"98844b8b-96b7-4d57-b73b-980cea001f3a","Type":"ContainerStarted","Data":"f7811f3a69d81a7d077f60b437158304afa1cb43b04aef5ecff7b7bfb45338eb"} Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.009980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" event={"ID":"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83","Type":"ContainerStarted","Data":"1c5376bb9c52287e426053e8479892f3b3b877a5c97e50345f9ad4d760c146d4"} Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.012679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" event={"ID":"1ec2f443-2180-4420-85ac-a10185e096df","Type":"ContainerStarted","Data":"28ed8daf543a6df5a234cf28168d0b223d6ab67bb220fc891a98b4bd5ee56747"} Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.014750 4749 generic.go:334] "Generic (PLEG): container finished" podID="c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb" 
containerID="378d8c3c3ed9997a3b232551bab40d52a5a6bd17cec3e8515ec46f601bf6d88c" exitCode=0 Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.014787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rs2xm" event={"ID":"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb","Type":"ContainerDied","Data":"378d8c3c3ed9997a3b232551bab40d52a5a6bd17cec3e8515ec46f601bf6d88c"} Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.037153 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-gwsbw" podStartSLOduration=2.831212723 podStartE2EDuration="5.037124701s" podCreationTimestamp="2026-01-28 18:40:30 +0000 UTC" firstStartedPulling="2026-01-28 18:40:32.636937338 +0000 UTC m=+300.648464113" lastFinishedPulling="2026-01-28 18:40:34.842849296 +0000 UTC m=+302.854376091" observedRunningTime="2026-01-28 18:40:35.029565759 +0000 UTC m=+303.041092554" watchObservedRunningTime="2026-01-28 18:40:35.037124701 +0000 UTC m=+303.048651486" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.615139 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-547bf4ccf6-h9jzh"] Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.616633 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.674426 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547bf4ccf6-h9jzh"] Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.772115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-oauth-config\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.772168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-console-config\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.772362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-serving-cert\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.772408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-trusted-ca-bundle\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.772437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-oauth-serving-cert\") pod \"console-547bf4ccf6-h9jzh\" (UID: 
\"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.772476 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwq72\" (UniqueName: \"kubernetes.io/projected/843ecf05-c061-4825-a1df-0f1afc7353e2-kube-api-access-fwq72\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.772506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-service-ca\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.873274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-trusted-ca-bundle\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.873374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-oauth-serving-cert\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.873417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwq72\" (UniqueName: \"kubernetes.io/projected/843ecf05-c061-4825-a1df-0f1afc7353e2-kube-api-access-fwq72\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.873448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-service-ca\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.873485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-oauth-config\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.873510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-console-config\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.873573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-serving-cert\") pod 
\"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.875475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-oauth-serving-cert\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.875983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-trusted-ca-bundle\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.876130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-service-ca\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.876318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-console-config\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.881215 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-oauth-config\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.889444 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-serving-cert\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.899943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwq72\" (UniqueName: \"kubernetes.io/projected/843ecf05-c061-4825-a1df-0f1afc7353e2-kube-api-access-fwq72\") pod \"console-547bf4ccf6-h9jzh\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:35 crc kubenswrapper[4749]: I0128 18:40:35.934124 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.021496 4749 generic.go:334] "Generic (PLEG): container finished" podID="ccf7372f-949d-493e-b5ea-f005059fc7c9" containerID="abb8ac39cb3ca9808fdab1c89d638663efcb6aea5c2ade995d6f6fa713b11c5e" exitCode=0 Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.021577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ccf7372f-949d-493e-b5ea-f005059fc7c9","Type":"ContainerDied","Data":"abb8ac39cb3ca9808fdab1c89d638663efcb6aea5c2ade995d6f6fa713b11c5e"} Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.025184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" event={"ID":"a2580e9e-f086-4e94-ac9a-c74ffb5ebb83","Type":"ContainerStarted","Data":"d5c7076edaa0e55d09e786ddaaf74ff08703c26683664eb6adac39dca8ae686c"} Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.028140 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rs2xm" event={"ID":"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb","Type":"ContainerStarted","Data":"7419797c7d72ff15b6d4535e99a02542ae6a33702f1834fe2ed5af9d0bf9d547"} Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.028185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rs2xm" event={"ID":"c416a6c9-fe5a-49ed-b0fb-b9ff0aae89eb","Type":"ContainerStarted","Data":"9401d64d7ccc417883434b2a633288ac71adb406646996f5a3bf4d152cdf9784"} Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.083905 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mlncp" podStartSLOduration=3.910598217 podStartE2EDuration="6.083881468s" podCreationTimestamp="2026-01-28 18:40:30 +0000 UTC" firstStartedPulling="2026-01-28 18:40:31.60130928 +0000 UTC m=+299.612836055" lastFinishedPulling="2026-01-28 18:40:33.774592531 +0000 UTC m=+301.786119306" observedRunningTime="2026-01-28 18:40:36.068581338 +0000 UTC m=+304.080108113" watchObservedRunningTime="2026-01-28 18:40:36.083881468 +0000 UTC m=+304.095408243" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.110855 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rs2xm" podStartSLOduration=4.130272846 podStartE2EDuration="6.110815179s" podCreationTimestamp="2026-01-28 18:40:30 +0000 UTC" firstStartedPulling="2026-01-28 18:40:31.795457032 +0000 UTC m=+299.806983807" lastFinishedPulling="2026-01-28 18:40:33.775999375 +0000 UTC m=+301.787526140" observedRunningTime="2026-01-28 18:40:36.095941749 +0000 UTC m=+304.107468524" watchObservedRunningTime="2026-01-28 18:40:36.110815179 +0000 UTC m=+304.122341954" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.126382 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-555d778d79-lwsht"] Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.127183 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.131498 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.131853 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.131984 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.132269 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dng3mk53ifnh9" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.132404 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.132485 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-hm8wn" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.143246 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-555d778d79-lwsht"] Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.280702 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-secret-metrics-client-certs\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.281147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-client-ca-bundle\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.281231 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/559588ad-0b94-4587-9a9c-94fae9fdd016-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.281292 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/559588ad-0b94-4587-9a9c-94fae9fdd016-metrics-server-audit-profiles\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.281361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mfzv\" (UniqueName: \"kubernetes.io/projected/559588ad-0b94-4587-9a9c-94fae9fdd016-kube-api-access-7mfzv\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " 
pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.281413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-secret-metrics-server-tls\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.281458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/559588ad-0b94-4587-9a9c-94fae9fdd016-audit-log\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.382718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/559588ad-0b94-4587-9a9c-94fae9fdd016-audit-log\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.382773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-secret-metrics-client-certs\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.382803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-client-ca-bundle\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.382833 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/559588ad-0b94-4587-9a9c-94fae9fdd016-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.382868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/559588ad-0b94-4587-9a9c-94fae9fdd016-metrics-server-audit-profiles\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.382919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mfzv\" (UniqueName: \"kubernetes.io/projected/559588ad-0b94-4587-9a9c-94fae9fdd016-kube-api-access-7mfzv\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.382970 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-secret-metrics-server-tls\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.383441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/559588ad-0b94-4587-9a9c-94fae9fdd016-audit-log\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.384547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/559588ad-0b94-4587-9a9c-94fae9fdd016-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.385051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/559588ad-0b94-4587-9a9c-94fae9fdd016-metrics-server-audit-profiles\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.387056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-secret-metrics-client-certs\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.393549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-secret-metrics-server-tls\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.399108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559588ad-0b94-4587-9a9c-94fae9fdd016-client-ca-bundle\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.400387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mfzv\" (UniqueName: \"kubernetes.io/projected/559588ad-0b94-4587-9a9c-94fae9fdd016-kube-api-access-7mfzv\") pod \"metrics-server-555d778d79-lwsht\" (UID: \"559588ad-0b94-4587-9a9c-94fae9fdd016\") " pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.465458 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.591870 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl"] Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.592817 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.595082 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.596405 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.608436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl"] Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.687833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd6abfb8-3fd8-4c78-8974-1b017fd826ae-monitoring-plugin-cert\") pod \"monitoring-plugin-6fdcc87c5d-mlwzl\" (UID: \"dd6abfb8-3fd8-4c78-8974-1b017fd826ae\") " pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.790115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd6abfb8-3fd8-4c78-8974-1b017fd826ae-monitoring-plugin-cert\") pod \"monitoring-plugin-6fdcc87c5d-mlwzl\" (UID: \"dd6abfb8-3fd8-4c78-8974-1b017fd826ae\") " pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.800547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd6abfb8-3fd8-4c78-8974-1b017fd826ae-monitoring-plugin-cert\") pod \"monitoring-plugin-6fdcc87c5d-mlwzl\" (UID: \"dd6abfb8-3fd8-4c78-8974-1b017fd826ae\") " pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" Jan 28 18:40:36 crc kubenswrapper[4749]: I0128 18:40:36.958955 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.044561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" event={"ID":"98844b8b-96b7-4d57-b73b-980cea001f3a","Type":"ContainerStarted","Data":"114082d9967ee53f79ac9653fb3a9b6486b30c0311088a4ec640dcdeb96fe94f"} Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.044619 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" event={"ID":"98844b8b-96b7-4d57-b73b-980cea001f3a","Type":"ContainerStarted","Data":"0edb347fd884c0931abab7d55bbb36ea78c836f4beb800e005f8b3b02dc51fc6"} Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.088153 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547bf4ccf6-h9jzh"] Jan 28 18:40:37 crc kubenswrapper[4749]: W0128 18:40:37.097454 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod843ecf05_c061_4825_a1df_0f1afc7353e2.slice/crio-d1080e2eb4bf20389cf3db4713e60ec60cf9ed9db4f5318ad511e46751ea6a81 WatchSource:0}: Error finding container d1080e2eb4bf20389cf3db4713e60ec60cf9ed9db4f5318ad511e46751ea6a81: Status 404 returned error can't find the container with id d1080e2eb4bf20389cf3db4713e60ec60cf9ed9db4f5318ad511e46751ea6a81 Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.167300 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-555d778d79-lwsht"] Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.195286 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.204844 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.216204 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.216407 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-q9q8t" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.216528 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.216943 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.217363 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.217802 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.217979 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.218179 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-deb8bbt76ng2p" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.218411 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.218794 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.219671 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.225952 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.228701 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.236155 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.297983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298081 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298379 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298448 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c7585bb1-e81c-4381-9105-ee1a0843d724-config-out\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-web-config\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-config\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.298869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.299013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmb29\" (UniqueName: \"kubernetes.io/projected/c7585bb1-e81c-4381-9105-ee1a0843d724-kube-api-access-vmb29\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.299077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.299110 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c7585bb1-e81c-4381-9105-ee1a0843d724-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.299172 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.299195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.299248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401254 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c7585bb1-e81c-4381-9105-ee1a0843d724-config-out\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-web-config\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-config\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmb29\" (UniqueName: \"kubernetes.io/projected/c7585bb1-e81c-4381-9105-ee1a0843d724-kube-api-access-vmb29\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c7585bb1-e81c-4381-9105-ee1a0843d724-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.401560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.402716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.403443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.403683 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.406272 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c7585bb1-e81c-4381-9105-ee1a0843d724-config-out\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.407004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-config\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.407983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.408100 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c7585bb1-e81c-4381-9105-ee1a0843d724-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.408170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.408347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.410964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.411348 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.415050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl"] Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.415611 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.417642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.418036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.418694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-web-config\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.419152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c7585bb1-e81c-4381-9105-ee1a0843d724-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.421936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmb29\" (UniqueName: \"kubernetes.io/projected/c7585bb1-e81c-4381-9105-ee1a0843d724-kube-api-access-vmb29\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.424736 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c7585bb1-e81c-4381-9105-ee1a0843d724-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c7585bb1-e81c-4381-9105-ee1a0843d724\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: W0128 18:40:37.434658 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd6abfb8_3fd8_4c78_8974_1b017fd826ae.slice/crio-fc29ab8159fc7c2ab0a72190f3688ce577c8e2353e9ae4bc10a69cae4ce62497 WatchSource:0}: Error finding container fc29ab8159fc7c2ab0a72190f3688ce577c8e2353e9ae4bc10a69cae4ce62497: Status 404 
returned error can't find the container with id fc29ab8159fc7c2ab0a72190f3688ce577c8e2353e9ae4bc10a69cae4ce62497 Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.544630 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:37 crc kubenswrapper[4749]: I0128 18:40:37.959767 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 28 18:40:38 crc kubenswrapper[4749]: I0128 18:40:38.050885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" event={"ID":"dd6abfb8-3fd8-4c78-8974-1b017fd826ae","Type":"ContainerStarted","Data":"fc29ab8159fc7c2ab0a72190f3688ce577c8e2353e9ae4bc10a69cae4ce62497"} Jan 28 18:40:38 crc kubenswrapper[4749]: I0128 18:40:38.053368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547bf4ccf6-h9jzh" event={"ID":"843ecf05-c061-4825-a1df-0f1afc7353e2","Type":"ContainerStarted","Data":"df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c"} Jan 28 18:40:38 crc kubenswrapper[4749]: I0128 18:40:38.053410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547bf4ccf6-h9jzh" event={"ID":"843ecf05-c061-4825-a1df-0f1afc7353e2","Type":"ContainerStarted","Data":"d1080e2eb4bf20389cf3db4713e60ec60cf9ed9db4f5318ad511e46751ea6a81"} Jan 28 18:40:38 crc kubenswrapper[4749]: I0128 18:40:38.054756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" event={"ID":"559588ad-0b94-4587-9a9c-94fae9fdd016","Type":"ContainerStarted","Data":"8109b714ed76bb1989690c7247980b9aeb4b75fc68973fd29ea548a6e451c7ab"} Jan 28 18:40:38 crc kubenswrapper[4749]: I0128 18:40:38.057878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" event={"ID":"98844b8b-96b7-4d57-b73b-980cea001f3a","Type":"ContainerStarted","Data":"04bed55ce648aac1211602d02eab5e3a607ab4fa6b3cdd991f02fb58acaa86d6"} Jan 28 18:40:38 crc kubenswrapper[4749]: I0128 18:40:38.078885 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-547bf4ccf6-h9jzh" podStartSLOduration=3.07886401 podStartE2EDuration="3.07886401s" podCreationTimestamp="2026-01-28 18:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:40:38.073852508 +0000 UTC m=+306.085379303" watchObservedRunningTime="2026-01-28 18:40:38.07886401 +0000 UTC m=+306.090390785" Jan 28 18:40:38 crc kubenswrapper[4749]: W0128 18:40:38.521577 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7585bb1_e81c_4381_9105_ee1a0843d724.slice/crio-77c62c017a31a2d8e043c5e2f8cdb7f400d66d42c61eb2616cf0ca05b940f0c4 WatchSource:0}: Error finding container 77c62c017a31a2d8e043c5e2f8cdb7f400d66d42c61eb2616cf0ca05b940f0c4: Status 404 returned error can't find the container with id 77c62c017a31a2d8e043c5e2f8cdb7f400d66d42c61eb2616cf0ca05b940f0c4 Jan 28 18:40:39 crc kubenswrapper[4749]: I0128 18:40:39.077092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7585bb1-e81c-4381-9105-ee1a0843d724","Type":"ContainerStarted","Data":"77c62c017a31a2d8e043c5e2f8cdb7f400d66d42c61eb2616cf0ca05b940f0c4"} Jan 28 18:40:42 crc kubenswrapper[4749]: 
I0128 18:40:42.105032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ccf7372f-949d-493e-b5ea-f005059fc7c9","Type":"ContainerStarted","Data":"141cd975465ef764ee07421acf687c2da5a989cdf74264f2640a63ae020df7bd"} Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.106532 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" event={"ID":"dd6abfb8-3fd8-4c78-8974-1b017fd826ae","Type":"ContainerStarted","Data":"b0e3ff3eedeb3a245e8328c6eb7f124869810e82b02f3f386068216af9e94454"} Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.106838 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.108606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" event={"ID":"559588ad-0b94-4587-9a9c-94fae9fdd016","Type":"ContainerStarted","Data":"7abf500e7968c32ecf4bdb7e3907aff199c48004cd020ba3297089bd0f6c5ddd"} Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.110119 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7585bb1-e81c-4381-9105-ee1a0843d724" containerID="e080267a8e236258e35c9f7064d6420fd47654bd4398e1bc8f2c726e93f8ae2f" exitCode=0 Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.110148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7585bb1-e81c-4381-9105-ee1a0843d724","Type":"ContainerDied","Data":"e080267a8e236258e35c9f7064d6420fd47654bd4398e1bc8f2c726e93f8ae2f"} Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.113033 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.115478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" event={"ID":"98844b8b-96b7-4d57-b73b-980cea001f3a","Type":"ContainerStarted","Data":"9813e6840d7710e22252381ecc5bdee2a3dde1a3c9cee92380d3db9687fb26c7"} Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.127063 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6fdcc87c5d-mlwzl" podStartSLOduration=2.9161852440000002 podStartE2EDuration="6.127047509s" podCreationTimestamp="2026-01-28 18:40:36 +0000 UTC" firstStartedPulling="2026-01-28 18:40:37.437361577 +0000 UTC m=+305.448888352" lastFinishedPulling="2026-01-28 18:40:40.648223782 +0000 UTC m=+308.659750617" observedRunningTime="2026-01-28 18:40:42.123208506 +0000 UTC m=+310.134735291" watchObservedRunningTime="2026-01-28 18:40:42.127047509 +0000 UTC m=+310.138574284" Jan 28 18:40:42 crc kubenswrapper[4749]: I0128 18:40:42.183153 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" podStartSLOduration=2.724956594 podStartE2EDuration="6.183137255s" podCreationTimestamp="2026-01-28 18:40:36 +0000 UTC" firstStartedPulling="2026-01-28 18:40:37.178685226 +0000 UTC m=+305.190212001" lastFinishedPulling="2026-01-28 18:40:40.636865887 +0000 UTC m=+308.648392662" observedRunningTime="2026-01-28 18:40:42.183113964 +0000 UTC m=+310.194640749" watchObservedRunningTime="2026-01-28 18:40:42.183137255 +0000 UTC m=+310.194664030" Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 
18:40:43.124909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" event={"ID":"98844b8b-96b7-4d57-b73b-980cea001f3a","Type":"ContainerStarted","Data":"e0d33dd96ff8b8698735c77e0688c77cce3fb9637c130761ffa588b68dc3ecc3"} Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 18:40:43.125303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" event={"ID":"98844b8b-96b7-4d57-b73b-980cea001f3a","Type":"ContainerStarted","Data":"8178e40a6d0923624ae3638c6fb47a07498ab207eaffbd14311981a712e7b7a9"} Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 18:40:43.125720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 18:40:43.132870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ccf7372f-949d-493e-b5ea-f005059fc7c9","Type":"ContainerStarted","Data":"dd99c021a08839ab201cbdee6d0ea0df4e2c460c145b6b7254eb4300d055a2dc"} Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 18:40:43.132919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ccf7372f-949d-493e-b5ea-f005059fc7c9","Type":"ContainerStarted","Data":"b4f8ef57c885a93a852774a61530989a60549ad21ce6f912c8ff16cf00cc2962"} Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 18:40:43.132937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ccf7372f-949d-493e-b5ea-f005059fc7c9","Type":"ContainerStarted","Data":"d0143ff9d136e3521a99585ff548c04c46087792549b3b77f6a4fb4434c0612e"} Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 18:40:43.132949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ccf7372f-949d-493e-b5ea-f005059fc7c9","Type":"ContainerStarted","Data":"bf8cbe69c7dcee5474ccc082e5c773494555463d703a388a16f71205d67bcf79"} Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 18:40:43.143138 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" Jan 28 18:40:43 crc kubenswrapper[4749]: I0128 18:40:43.152244 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-757c76cf74-pxbgx" podStartSLOduration=4.90995507 podStartE2EDuration="11.152149182s" podCreationTimestamp="2026-01-28 18:40:32 +0000 UTC" firstStartedPulling="2026-01-28 18:40:34.393565839 +0000 UTC m=+302.405092614" lastFinishedPulling="2026-01-28 18:40:40.635759951 +0000 UTC m=+308.647286726" observedRunningTime="2026-01-28 18:40:43.148200867 +0000 UTC m=+311.159727682" watchObservedRunningTime="2026-01-28 18:40:43.152149182 +0000 UTC m=+311.163675957" Jan 28 18:40:44 crc kubenswrapper[4749]: I0128 18:40:44.144990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ccf7372f-949d-493e-b5ea-f005059fc7c9","Type":"ContainerStarted","Data":"00ec8064b27d80d0cf2e41f85dd06019e4f9180e14a5a37fc838cf9a181391cf"} Jan 28 18:40:45 crc kubenswrapper[4749]: I0128 18:40:45.936147 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:45 crc kubenswrapper[4749]: I0128 18:40:45.936205 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:45 crc kubenswrapper[4749]: I0128 18:40:45.942802 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:45 crc kubenswrapper[4749]: I0128 18:40:45.961979 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=7.892676096 podStartE2EDuration="14.961957245s" podCreationTimestamp="2026-01-28 18:40:31 +0000 UTC" firstStartedPulling="2026-01-28 18:40:33.315778173 +0000 UTC m=+301.327304948" lastFinishedPulling="2026-01-28 18:40:40.385059322 +0000 UTC m=+308.396586097" observedRunningTime="2026-01-28 18:40:44.176169239 +0000 UTC m=+312.187696024" watchObservedRunningTime="2026-01-28 18:40:45.961957245 +0000 UTC m=+313.973484020" Jan 28 18:40:46 crc kubenswrapper[4749]: I0128 18:40:46.161889 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:40:46 crc kubenswrapper[4749]: I0128 18:40:46.213114 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kjr8m"] Jan 28 18:40:47 crc kubenswrapper[4749]: I0128 18:40:47.173752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7585bb1-e81c-4381-9105-ee1a0843d724","Type":"ContainerStarted","Data":"c40e5f728bd9bc574dd48692771a5bf591cb8c726f71da106201645ba208f5c7"} Jan 28 18:40:47 crc kubenswrapper[4749]: I0128 18:40:47.175412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7585bb1-e81c-4381-9105-ee1a0843d724","Type":"ContainerStarted","Data":"01b02390ea4a5fa08a32a6bcd0315912a794955f418373445698592b98aebd72"} Jan 28 18:40:47 crc kubenswrapper[4749]: I0128 18:40:47.175499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7585bb1-e81c-4381-9105-ee1a0843d724","Type":"ContainerStarted","Data":"6b800867948ce00e029ab55f8f63f2503f712fdc49ddfe6d4c5830d14bee591e"} Jan 28 18:40:47 crc kubenswrapper[4749]: I0128 18:40:47.175524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7585bb1-e81c-4381-9105-ee1a0843d724","Type":"ContainerStarted","Data":"a5b55ac6c1ee93b4f022d3e6df70b340704e731748feda8b17fd3dfaaeeff0a9"} Jan 28 18:40:47 crc kubenswrapper[4749]: I0128 18:40:47.175547 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7585bb1-e81c-4381-9105-ee1a0843d724","Type":"ContainerStarted","Data":"47c8917e77f6a9f918b0f5a743edaf4e5d1725f3531970d3222338c25eff7649"} Jan 28 18:40:48 crc kubenswrapper[4749]: I0128 18:40:48.184816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7585bb1-e81c-4381-9105-ee1a0843d724","Type":"ContainerStarted","Data":"63993db9f1dcf551f0d6cae578324fd95ffc318d948c94426e260375b4d40b3f"} Jan 28 18:40:48 crc kubenswrapper[4749]: I0128 18:40:48.213597 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=7.159010944 podStartE2EDuration="11.213562348s" podCreationTimestamp="2026-01-28 18:40:37 +0000 UTC" firstStartedPulling="2026-01-28 18:40:42.111991325 +0000 UTC m=+310.123518100" lastFinishedPulling="2026-01-28 18:40:46.166542719 +0000 UTC 
m=+314.178069504" observedRunningTime="2026-01-28 18:40:48.208183778 +0000 UTC m=+316.219710553" watchObservedRunningTime="2026-01-28 18:40:48.213562348 +0000 UTC m=+316.225089123" Jan 28 18:40:52 crc kubenswrapper[4749]: I0128 18:40:52.545711 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:40:56 crc kubenswrapper[4749]: I0128 18:40:56.466018 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:56 crc kubenswrapper[4749]: I0128 18:40:56.466077 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:40:57 crc kubenswrapper[4749]: I0128 18:40:57.467195 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:40:57 crc kubenswrapper[4749]: I0128 18:40:57.467252 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:41:11 crc kubenswrapper[4749]: I0128 18:41:11.256202 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-kjr8m" podUID="89157053-d5d1-40f0-8b36-411d637d8385" containerName="console" containerID="cri-o://8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed" gracePeriod=15 Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.206051 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kjr8m_89157053-d5d1-40f0-8b36-411d637d8385/console/0.log" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.206450 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.341361 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kjr8m_89157053-d5d1-40f0-8b36-411d637d8385/console/0.log" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.341412 4749 generic.go:334] "Generic (PLEG): container finished" podID="89157053-d5d1-40f0-8b36-411d637d8385" containerID="8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed" exitCode=2 Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.341442 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kjr8m" event={"ID":"89157053-d5d1-40f0-8b36-411d637d8385","Type":"ContainerDied","Data":"8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed"} Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.341475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kjr8m" event={"ID":"89157053-d5d1-40f0-8b36-411d637d8385","Type":"ContainerDied","Data":"b10ad17b9a532ff3723c33954acdbd1251567d400e6d17feb30622ba7fc10934"} Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.341491 4749 scope.go:117] "RemoveContainer" containerID="8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.341490 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kjr8m" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.351992 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-trusted-ca-bundle\") pod \"89157053-d5d1-40f0-8b36-411d637d8385\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.352062 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-console-config\") pod \"89157053-d5d1-40f0-8b36-411d637d8385\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.352093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-service-ca\") pod \"89157053-d5d1-40f0-8b36-411d637d8385\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.352157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-serving-cert\") pod \"89157053-d5d1-40f0-8b36-411d637d8385\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.352201 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49l9q\" (UniqueName: \"kubernetes.io/projected/89157053-d5d1-40f0-8b36-411d637d8385-kube-api-access-49l9q\") pod \"89157053-d5d1-40f0-8b36-411d637d8385\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.352224 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-oauth-serving-cert\") pod \"89157053-d5d1-40f0-8b36-411d637d8385\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.352244 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-oauth-config\") pod \"89157053-d5d1-40f0-8b36-411d637d8385\" (UID: \"89157053-d5d1-40f0-8b36-411d637d8385\") " Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.353070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "89157053-d5d1-40f0-8b36-411d637d8385" (UID: "89157053-d5d1-40f0-8b36-411d637d8385"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.353127 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "89157053-d5d1-40f0-8b36-411d637d8385" (UID: "89157053-d5d1-40f0-8b36-411d637d8385"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.353353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-service-ca" (OuterVolumeSpecName: "service-ca") pod "89157053-d5d1-40f0-8b36-411d637d8385" (UID: "89157053-d5d1-40f0-8b36-411d637d8385"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.353810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-console-config" (OuterVolumeSpecName: "console-config") pod "89157053-d5d1-40f0-8b36-411d637d8385" (UID: "89157053-d5d1-40f0-8b36-411d637d8385"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.357264 4749 scope.go:117] "RemoveContainer" containerID="8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed" Jan 28 18:41:12 crc kubenswrapper[4749]: E0128 18:41:12.357668 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed\": container with ID starting with 8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed not found: ID does not exist" containerID="8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.357713 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed"} err="failed to get container status \"8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed\": rpc error: code = NotFound desc = could not find container \"8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed\": container with ID starting with 8e29304934b38c49c15203687e6eb37185c309d52a4d5902d6085cee58e9c2ed not found: ID does not exist" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.359675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "89157053-d5d1-40f0-8b36-411d637d8385" (UID: "89157053-d5d1-40f0-8b36-411d637d8385"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.359767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89157053-d5d1-40f0-8b36-411d637d8385-kube-api-access-49l9q" (OuterVolumeSpecName: "kube-api-access-49l9q") pod "89157053-d5d1-40f0-8b36-411d637d8385" (UID: "89157053-d5d1-40f0-8b36-411d637d8385"). InnerVolumeSpecName "kube-api-access-49l9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.360110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "89157053-d5d1-40f0-8b36-411d637d8385" (UID: "89157053-d5d1-40f0-8b36-411d637d8385"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.453686 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.453724 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.453733 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.453742 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.453753 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49l9q\" (UniqueName: \"kubernetes.io/projected/89157053-d5d1-40f0-8b36-411d637d8385-kube-api-access-49l9q\") on node \"crc\" DevicePath \"\"" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.453764 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89157053-d5d1-40f0-8b36-411d637d8385-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.453773 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89157053-d5d1-40f0-8b36-411d637d8385-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.684208 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kjr8m"] Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.690156 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-kjr8m"] Jan 28 18:41:12 crc kubenswrapper[4749]: I0128 18:41:12.879133 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89157053-d5d1-40f0-8b36-411d637d8385" path="/var/lib/kubelet/pods/89157053-d5d1-40f0-8b36-411d637d8385/volumes" Jan 28 18:41:16 crc kubenswrapper[4749]: I0128 18:41:16.475286 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:41:16 crc kubenswrapper[4749]: I0128 18:41:16.484983 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" Jan 28 18:41:27 crc kubenswrapper[4749]: I0128 18:41:27.467187 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:41:27 crc kubenswrapper[4749]: I0128 18:41:27.467924 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:41:37 crc kubenswrapper[4749]: I0128 18:41:37.546707 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:41:37 crc kubenswrapper[4749]: I0128 18:41:37.581559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:41:38 crc kubenswrapper[4749]: I0128 18:41:38.545832 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Jan 28 18:41:41 crc kubenswrapper[4749]: I0128 18:41:41.211902 4749 scope.go:117] "RemoveContainer" containerID="9c8e18a83f57fe165f256691e9cc0129f62a6500b36edfd810df585e0fe3dd5c" Jan 28 18:41:41 crc kubenswrapper[4749]: I0128 18:41:41.229113 4749 scope.go:117] "RemoveContainer" containerID="08fefda8eb2c2c263f24f28e552e214c2b2a0c30bc56650ac56150436ace2092" Jan 28 18:41:41 crc kubenswrapper[4749]: I0128 18:41:41.244120 4749 scope.go:117] "RemoveContainer" containerID="82069124e53fc0fdea95ba68babe5439a495896b5d4af1cc002f241a712c8d7e" Jan 28 18:41:41 crc kubenswrapper[4749]: I0128 18:41:41.261622 4749 scope.go:117] "RemoveContainer" containerID="f08bf1eafd315b11be73143b39c66ad6a0fb1c378c046053946992360aed3c16" Jan 28 18:41:41 crc kubenswrapper[4749]: I0128 18:41:41.284440 4749 scope.go:117] "RemoveContainer" containerID="02274c5c85ffe39595aa5f860f2e84d7b572ea8f4f53edcc25dffb24adae0b71" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.117001 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54b4fd76b5-v7gv6"] Jan 28 18:41:49 crc kubenswrapper[4749]: E0128 18:41:49.118005 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89157053-d5d1-40f0-8b36-411d637d8385" containerName="console" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.118022 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="89157053-d5d1-40f0-8b36-411d637d8385" containerName="console" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.118175 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="89157053-d5d1-40f0-8b36-411d637d8385" containerName="console" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.118683 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.135871 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54b4fd76b5-v7gv6"] Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.226067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-oauth-config\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.226131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-console-config\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.226175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-service-ca\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.226212 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgbn8\" (UniqueName: \"kubernetes.io/projected/ec672680-fd53-4817-84ef-86edc8d76b8d-kube-api-access-sgbn8\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.226239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-serving-cert\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.226263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-oauth-serving-cert\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.226288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-trusted-ca-bundle\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.327771 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-service-ca\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc 
kubenswrapper[4749]: I0128 18:41:49.328189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgbn8\" (UniqueName: \"kubernetes.io/projected/ec672680-fd53-4817-84ef-86edc8d76b8d-kube-api-access-sgbn8\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.328255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-serving-cert\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.328285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-oauth-serving-cert\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.329115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-trusted-ca-bundle\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.329124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-oauth-serving-cert\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.329227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-oauth-config\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.329263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-console-config\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.329889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-trusted-ca-bundle\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.329966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-console-config\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc 
kubenswrapper[4749]: I0128 18:41:49.330381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-service-ca\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.333114 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-oauth-config\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.335107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-serving-cert\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.344136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgbn8\" (UniqueName: \"kubernetes.io/projected/ec672680-fd53-4817-84ef-86edc8d76b8d-kube-api-access-sgbn8\") pod \"console-54b4fd76b5-v7gv6\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.436661 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:49 crc kubenswrapper[4749]: I0128 18:41:49.617591 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54b4fd76b5-v7gv6"] Jan 28 18:41:50 crc kubenswrapper[4749]: I0128 18:41:50.593223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b4fd76b5-v7gv6" event={"ID":"ec672680-fd53-4817-84ef-86edc8d76b8d","Type":"ContainerStarted","Data":"a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b"} Jan 28 18:41:50 crc kubenswrapper[4749]: I0128 18:41:50.593267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b4fd76b5-v7gv6" event={"ID":"ec672680-fd53-4817-84ef-86edc8d76b8d","Type":"ContainerStarted","Data":"1f364c398eb6c0daa74f7ad9a03061c65e27e2faaac7e358319fbc3bfd759c42"} Jan 28 18:41:50 crc kubenswrapper[4749]: I0128 18:41:50.610178 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54b4fd76b5-v7gv6" podStartSLOduration=1.610155986 podStartE2EDuration="1.610155986s" podCreationTimestamp="2026-01-28 18:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:41:50.60780988 +0000 UTC m=+378.619336665" watchObservedRunningTime="2026-01-28 18:41:50.610155986 +0000 UTC m=+378.621682761" Jan 28 18:41:57 crc kubenswrapper[4749]: I0128 18:41:57.468843 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:41:57 crc kubenswrapper[4749]: I0128 18:41:57.469801 4749 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:41:57 crc kubenswrapper[4749]: I0128 18:41:57.469878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:41:57 crc kubenswrapper[4749]: I0128 18:41:57.471022 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c194349366700299c12535062fa3cdc45843fcdc3dd4a7242dce6417a2b9ece"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 18:41:57 crc kubenswrapper[4749]: I0128 18:41:57.471093 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://8c194349366700299c12535062fa3cdc45843fcdc3dd4a7242dce6417a2b9ece" gracePeriod=600 Jan 28 18:41:57 crc kubenswrapper[4749]: I0128 18:41:57.638270 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="8c194349366700299c12535062fa3cdc45843fcdc3dd4a7242dce6417a2b9ece" exitCode=0 Jan 28 18:41:57 crc kubenswrapper[4749]: I0128 18:41:57.638317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"8c194349366700299c12535062fa3cdc45843fcdc3dd4a7242dce6417a2b9ece"} Jan 28 18:41:57 crc kubenswrapper[4749]: I0128 18:41:57.638398 4749 scope.go:117] "RemoveContainer" containerID="4776e4bbf405860b0865ff2250d0eef5141f9c0b4c47049e6cb2d3cde9522949" Jan 28 18:41:58 crc kubenswrapper[4749]: I0128 18:41:58.646256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"1b3f4ad866e9df1f15443a8908d5b0434af59343a634d3e21933697773654ca4"} Jan 28 18:41:59 crc kubenswrapper[4749]: I0128 18:41:59.437290 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:59 crc kubenswrapper[4749]: I0128 18:41:59.437697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:59 crc kubenswrapper[4749]: I0128 18:41:59.442706 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:59 crc kubenswrapper[4749]: I0128 18:41:59.659645 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:41:59 crc kubenswrapper[4749]: I0128 18:41:59.716445 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-547bf4ccf6-h9jzh"] Jan 28 18:42:24 crc kubenswrapper[4749]: I0128 18:42:24.755149 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-547bf4ccf6-h9jzh" 
podUID="843ecf05-c061-4825-a1df-0f1afc7353e2" containerName="console" containerID="cri-o://df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c" gracePeriod=15 Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.084190 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-547bf4ccf6-h9jzh_843ecf05-c061-4825-a1df-0f1afc7353e2/console/0.log" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.084563 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.182164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-trusted-ca-bundle\") pod \"843ecf05-c061-4825-a1df-0f1afc7353e2\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.182224 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-oauth-serving-cert\") pod \"843ecf05-c061-4825-a1df-0f1afc7353e2\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.182251 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-service-ca\") pod \"843ecf05-c061-4825-a1df-0f1afc7353e2\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.182369 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-serving-cert\") pod \"843ecf05-c061-4825-a1df-0f1afc7353e2\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.182407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-console-config\") pod \"843ecf05-c061-4825-a1df-0f1afc7353e2\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.182436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwq72\" (UniqueName: \"kubernetes.io/projected/843ecf05-c061-4825-a1df-0f1afc7353e2-kube-api-access-fwq72\") pod \"843ecf05-c061-4825-a1df-0f1afc7353e2\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.182466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-oauth-config\") pod \"843ecf05-c061-4825-a1df-0f1afc7353e2\" (UID: \"843ecf05-c061-4825-a1df-0f1afc7353e2\") " Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.183286 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-service-ca" (OuterVolumeSpecName: "service-ca") pod "843ecf05-c061-4825-a1df-0f1afc7353e2" (UID: "843ecf05-c061-4825-a1df-0f1afc7353e2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.183311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-console-config" (OuterVolumeSpecName: "console-config") pod "843ecf05-c061-4825-a1df-0f1afc7353e2" (UID: "843ecf05-c061-4825-a1df-0f1afc7353e2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.183435 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "843ecf05-c061-4825-a1df-0f1afc7353e2" (UID: "843ecf05-c061-4825-a1df-0f1afc7353e2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.183427 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "843ecf05-c061-4825-a1df-0f1afc7353e2" (UID: "843ecf05-c061-4825-a1df-0f1afc7353e2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.187821 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843ecf05-c061-4825-a1df-0f1afc7353e2-kube-api-access-fwq72" (OuterVolumeSpecName: "kube-api-access-fwq72") pod "843ecf05-c061-4825-a1df-0f1afc7353e2" (UID: "843ecf05-c061-4825-a1df-0f1afc7353e2"). InnerVolumeSpecName "kube-api-access-fwq72". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.187896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "843ecf05-c061-4825-a1df-0f1afc7353e2" (UID: "843ecf05-c061-4825-a1df-0f1afc7353e2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.187917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "843ecf05-c061-4825-a1df-0f1afc7353e2" (UID: "843ecf05-c061-4825-a1df-0f1afc7353e2"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.284411 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.284441 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.284454 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.284464 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.284478 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/843ecf05-c061-4825-a1df-0f1afc7353e2-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.284488 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwq72\" (UniqueName: \"kubernetes.io/projected/843ecf05-c061-4825-a1df-0f1afc7353e2-kube-api-access-fwq72\") on node \"crc\" DevicePath \"\"" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.284499 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/843ecf05-c061-4825-a1df-0f1afc7353e2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.831188 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-547bf4ccf6-h9jzh_843ecf05-c061-4825-a1df-0f1afc7353e2/console/0.log" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.831533 4749 generic.go:334] "Generic (PLEG): container finished" podID="843ecf05-c061-4825-a1df-0f1afc7353e2" containerID="df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c" exitCode=2 Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.831576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547bf4ccf6-h9jzh" event={"ID":"843ecf05-c061-4825-a1df-0f1afc7353e2","Type":"ContainerDied","Data":"df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c"} Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.831623 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547bf4ccf6-h9jzh" event={"ID":"843ecf05-c061-4825-a1df-0f1afc7353e2","Type":"ContainerDied","Data":"d1080e2eb4bf20389cf3db4713e60ec60cf9ed9db4f5318ad511e46751ea6a81"} Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.831644 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547bf4ccf6-h9jzh" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.831648 4749 scope.go:117] "RemoveContainer" containerID="df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.850855 4749 scope.go:117] "RemoveContainer" containerID="df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c" Jan 28 18:42:25 crc kubenswrapper[4749]: E0128 18:42:25.851352 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c\": container with ID starting with df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c not found: ID does not exist" containerID="df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.851434 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c"} err="failed to get container status \"df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c\": rpc error: code = NotFound desc = could not find container \"df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c\": container with ID starting with df3acb6c9bb66cc03ae5ebe7c1bacd8176443e2a754bc92d0eabe5e242a89f3c not found: ID does not exist" Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.865517 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-547bf4ccf6-h9jzh"] Jan 28 18:42:25 crc kubenswrapper[4749]: I0128 18:42:25.870619 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-547bf4ccf6-h9jzh"] Jan 28 18:42:26 crc kubenswrapper[4749]: I0128 18:42:26.878606 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843ecf05-c061-4825-a1df-0f1afc7353e2" path="/var/lib/kubelet/pods/843ecf05-c061-4825-a1df-0f1afc7353e2/volumes" Jan 28 18:42:41 crc kubenswrapper[4749]: I0128 18:42:41.317908 4749 scope.go:117] "RemoveContainer" containerID="61a842e300a4ae0faee4eccf068a7bb5877fb5fb8610b7668e6ad61758eff569" Jan 28 18:42:41 crc kubenswrapper[4749]: I0128 18:42:41.331969 4749 scope.go:117] "RemoveContainer" containerID="9ecb988b34f11480980f3689d27f4f5957c46410631b7330361ef12093692f7a" Jan 28 18:43:41 crc kubenswrapper[4749]: I0128 18:43:41.370761 4749 scope.go:117] "RemoveContainer" containerID="69b5de84c695befd06dc04d0d91fe0134a8a79a90179f81d140205312ecf4fe0" Jan 28 18:43:57 crc kubenswrapper[4749]: I0128 18:43:57.466945 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:43:57 crc kubenswrapper[4749]: I0128 18:43:57.467521 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:44:27 crc kubenswrapper[4749]: I0128 18:44:27.468141 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:44:27 crc kubenswrapper[4749]: I0128 18:44:27.468932 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:44:41 crc kubenswrapper[4749]: I0128 18:44:41.412226 4749 scope.go:117] "RemoveContainer" containerID="776f8e0882a32def36f1f68153ebfae7c3a09c93279505d95a0a53492c539a9c" Jan 28 18:44:57 crc kubenswrapper[4749]: I0128 18:44:57.466902 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:44:57 crc kubenswrapper[4749]: I0128 18:44:57.467647 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:44:57 crc kubenswrapper[4749]: I0128 18:44:57.467716 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:44:57 crc kubenswrapper[4749]: I0128 18:44:57.468598 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b3f4ad866e9df1f15443a8908d5b0434af59343a634d3e21933697773654ca4"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 18:44:57 crc kubenswrapper[4749]: I0128 18:44:57.468692 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://1b3f4ad866e9df1f15443a8908d5b0434af59343a634d3e21933697773654ca4" gracePeriod=600 Jan 28 18:44:57 crc kubenswrapper[4749]: I0128 18:44:57.673655 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="1b3f4ad866e9df1f15443a8908d5b0434af59343a634d3e21933697773654ca4" exitCode=0 Jan 28 18:44:57 crc kubenswrapper[4749]: I0128 18:44:57.673844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"1b3f4ad866e9df1f15443a8908d5b0434af59343a634d3e21933697773654ca4"} Jan 28 18:44:57 crc kubenswrapper[4749]: I0128 18:44:57.673974 4749 scope.go:117] "RemoveContainer" containerID="8c194349366700299c12535062fa3cdc45843fcdc3dd4a7242dce6417a2b9ece" Jan 28 18:44:58 crc kubenswrapper[4749]: I0128 18:44:58.681483 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" 
event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"04f2d6bfc0ed8ee6f0293f2bd9184234285368c2476cf72bb7d6b248a665883d"} Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.173156 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg"] Jan 28 18:45:00 crc kubenswrapper[4749]: E0128 18:45:00.174393 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843ecf05-c061-4825-a1df-0f1afc7353e2" containerName="console" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.174483 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="843ecf05-c061-4825-a1df-0f1afc7353e2" containerName="console" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.174803 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="843ecf05-c061-4825-a1df-0f1afc7353e2" containerName="console" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.175962 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.179860 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.179930 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.186786 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg"] Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.237871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af8c2c33-582a-4b5f-965c-c23d2be58edf-config-volume\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.237985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af8c2c33-582a-4b5f-965c-c23d2be58edf-secret-volume\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.238098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92lgv\" (UniqueName: \"kubernetes.io/projected/af8c2c33-582a-4b5f-965c-c23d2be58edf-kube-api-access-92lgv\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.340090 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af8c2c33-582a-4b5f-965c-c23d2be58edf-config-volume\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.340134 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af8c2c33-582a-4b5f-965c-c23d2be58edf-secret-volume\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.340163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92lgv\" (UniqueName: \"kubernetes.io/projected/af8c2c33-582a-4b5f-965c-c23d2be58edf-kube-api-access-92lgv\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.342151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af8c2c33-582a-4b5f-965c-c23d2be58edf-config-volume\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.346899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af8c2c33-582a-4b5f-965c-c23d2be58edf-secret-volume\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.363043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92lgv\" (UniqueName: \"kubernetes.io/projected/af8c2c33-582a-4b5f-965c-c23d2be58edf-kube-api-access-92lgv\") pod \"collect-profiles-29493765-swqbg\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.530094 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:00 crc kubenswrapper[4749]: I0128 18:45:00.688582 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg"] Jan 28 18:45:01 crc kubenswrapper[4749]: I0128 18:45:01.702006 4749 generic.go:334] "Generic (PLEG): container finished" podID="af8c2c33-582a-4b5f-965c-c23d2be58edf" containerID="ca4fdaf4d36c301591672e5580061dd7ee2ccbd535b5ad03a7f69a40db58ed14" exitCode=0 Jan 28 18:45:01 crc kubenswrapper[4749]: I0128 18:45:01.702129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" event={"ID":"af8c2c33-582a-4b5f-965c-c23d2be58edf","Type":"ContainerDied","Data":"ca4fdaf4d36c301591672e5580061dd7ee2ccbd535b5ad03a7f69a40db58ed14"} Jan 28 18:45:01 crc kubenswrapper[4749]: I0128 18:45:01.702372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" event={"ID":"af8c2c33-582a-4b5f-965c-c23d2be58edf","Type":"ContainerStarted","Data":"88d522ea7dd3c70b99564242375756be9952d587ef326dd477cf90ac94beb301"} Jan 28 18:45:02 crc kubenswrapper[4749]: I0128 18:45:02.911839 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:45:02 crc kubenswrapper[4749]: I0128 18:45:02.972389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af8c2c33-582a-4b5f-965c-c23d2be58edf-config-volume\") pod \"af8c2c33-582a-4b5f-965c-c23d2be58edf\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " Jan 28 18:45:02 crc kubenswrapper[4749]: I0128 18:45:02.973578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af8c2c33-582a-4b5f-965c-c23d2be58edf-config-volume" (OuterVolumeSpecName: "config-volume") pod "af8c2c33-582a-4b5f-965c-c23d2be58edf" (UID: "af8c2c33-582a-4b5f-965c-c23d2be58edf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.073346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92lgv\" (UniqueName: \"kubernetes.io/projected/af8c2c33-582a-4b5f-965c-c23d2be58edf-kube-api-access-92lgv\") pod \"af8c2c33-582a-4b5f-965c-c23d2be58edf\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.073396 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af8c2c33-582a-4b5f-965c-c23d2be58edf-secret-volume\") pod \"af8c2c33-582a-4b5f-965c-c23d2be58edf\" (UID: \"af8c2c33-582a-4b5f-965c-c23d2be58edf\") " Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.073633 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af8c2c33-582a-4b5f-965c-c23d2be58edf-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.079922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8c2c33-582a-4b5f-965c-c23d2be58edf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af8c2c33-582a-4b5f-965c-c23d2be58edf" (UID: "af8c2c33-582a-4b5f-965c-c23d2be58edf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.081749 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8c2c33-582a-4b5f-965c-c23d2be58edf-kube-api-access-92lgv" (OuterVolumeSpecName: "kube-api-access-92lgv") pod "af8c2c33-582a-4b5f-965c-c23d2be58edf" (UID: "af8c2c33-582a-4b5f-965c-c23d2be58edf"). InnerVolumeSpecName "kube-api-access-92lgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.174264 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af8c2c33-582a-4b5f-965c-c23d2be58edf-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.174298 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92lgv\" (UniqueName: \"kubernetes.io/projected/af8c2c33-582a-4b5f-965c-c23d2be58edf-kube-api-access-92lgv\") on node \"crc\" DevicePath \"\"" Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.713029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" event={"ID":"af8c2c33-582a-4b5f-965c-c23d2be58edf","Type":"ContainerDied","Data":"88d522ea7dd3c70b99564242375756be9952d587ef326dd477cf90ac94beb301"} Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.713427 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88d522ea7dd3c70b99564242375756be9952d587ef326dd477cf90ac94beb301" Jan 28 18:45:03 crc kubenswrapper[4749]: I0128 18:45:03.713085 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg" Jan 28 18:46:05 crc kubenswrapper[4749]: I0128 18:46:05.917496 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw"] Jan 28 18:46:05 crc kubenswrapper[4749]: E0128 18:46:05.918297 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8c2c33-582a-4b5f-965c-c23d2be58edf" containerName="collect-profiles" Jan 28 18:46:05 crc kubenswrapper[4749]: I0128 18:46:05.918311 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8c2c33-582a-4b5f-965c-c23d2be58edf" containerName="collect-profiles" Jan 28 18:46:05 crc kubenswrapper[4749]: I0128 18:46:05.918539 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8c2c33-582a-4b5f-965c-c23d2be58edf" containerName="collect-profiles" Jan 28 18:46:05 crc kubenswrapper[4749]: I0128 18:46:05.919762 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:05 crc kubenswrapper[4749]: I0128 18:46:05.923272 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 18:46:05 crc kubenswrapper[4749]: I0128 18:46:05.934553 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw"] Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.052524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pvrk\" (UniqueName: \"kubernetes.io/projected/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-kube-api-access-7pvrk\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.052580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.052697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.154119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pvrk\" (UniqueName: \"kubernetes.io/projected/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-kube-api-access-7pvrk\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.154179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.154264 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.154824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.155109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.186312 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pvrk\" (UniqueName: \"kubernetes.io/projected/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-kube-api-access-7pvrk\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.242143 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:06 crc kubenswrapper[4749]: I0128 18:46:06.505760 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw"] Jan 28 18:46:07 crc kubenswrapper[4749]: I0128 18:46:07.118691 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerID="b582c450e8078b0212bc153cde0a7071cec73ad31c55553a9e637101196a584e" exitCode=0 Jan 28 18:46:07 crc kubenswrapper[4749]: I0128 18:46:07.118819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" event={"ID":"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931","Type":"ContainerDied","Data":"b582c450e8078b0212bc153cde0a7071cec73ad31c55553a9e637101196a584e"} Jan 28 18:46:07 crc kubenswrapper[4749]: I0128 18:46:07.119201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" event={"ID":"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931","Type":"ContainerStarted","Data":"6cb756e550bbbf39e995db5f232d77081f7bd994d5e165b35f14f48a945d2a19"} Jan 28 18:46:07 crc kubenswrapper[4749]: I0128 18:46:07.120662 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 18:46:09 crc kubenswrapper[4749]: I0128 18:46:09.133467 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerID="f5914c723e2c48df659efe7656b776f392f9ba9bf5e61d3d2a9758ba4c272b34" exitCode=0 Jan 28 18:46:09 crc kubenswrapper[4749]: I0128 18:46:09.133585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" event={"ID":"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931","Type":"ContainerDied","Data":"f5914c723e2c48df659efe7656b776f392f9ba9bf5e61d3d2a9758ba4c272b34"} Jan 28 18:46:10 crc kubenswrapper[4749]: I0128 18:46:10.142834 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerID="e8995a764ec2edbee525bcbd89cc7c45495ef5e7d18fc7e4ef0653be8156e387" exitCode=0 Jan 28 18:46:10 crc kubenswrapper[4749]: I0128 18:46:10.142883 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" event={"ID":"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931","Type":"ContainerDied","Data":"e8995a764ec2edbee525bcbd89cc7c45495ef5e7d18fc7e4ef0653be8156e387"} Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.336189 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.354589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pvrk\" (UniqueName: \"kubernetes.io/projected/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-kube-api-access-7pvrk\") pod \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.354678 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-util\") pod \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.359624 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-kube-api-access-7pvrk" (OuterVolumeSpecName: "kube-api-access-7pvrk") pod "1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" (UID: "1f22f1a6-aca7-47f1-9cf4-1218d6dd3931"). InnerVolumeSpecName "kube-api-access-7pvrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.373484 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-util" (OuterVolumeSpecName: "util") pod "1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" (UID: "1f22f1a6-aca7-47f1-9cf4-1218d6dd3931"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.455434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-bundle\") pod \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\" (UID: \"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931\") " Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.455900 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pvrk\" (UniqueName: \"kubernetes.io/projected/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-kube-api-access-7pvrk\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.455916 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-util\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.459651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-bundle" (OuterVolumeSpecName: "bundle") pod "1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" (UID: "1f22f1a6-aca7-47f1-9cf4-1218d6dd3931"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:46:11 crc kubenswrapper[4749]: I0128 18:46:11.557256 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f22f1a6-aca7-47f1-9cf4-1218d6dd3931-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:12 crc kubenswrapper[4749]: I0128 18:46:12.157833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" event={"ID":"1f22f1a6-aca7-47f1-9cf4-1218d6dd3931","Type":"ContainerDied","Data":"6cb756e550bbbf39e995db5f232d77081f7bd994d5e165b35f14f48a945d2a19"} Jan 28 18:46:12 crc kubenswrapper[4749]: I0128 18:46:12.157874 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cb756e550bbbf39e995db5f232d77081f7bd994d5e165b35f14f48a945d2a19" Jan 28 18:46:12 crc kubenswrapper[4749]: I0128 18:46:12.157898 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw" Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 18:46:17.340727 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wzvwl"] Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 18:46:17.342060 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovn-controller" containerID="cri-o://45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4" gracePeriod=30 Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 18:46:17.342166 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e" gracePeriod=30 Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 18:46:17.342235 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovn-acl-logging" containerID="cri-o://88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf" gracePeriod=30 Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 18:46:17.342208 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="northd" containerID="cri-o://79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1" gracePeriod=30 Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 18:46:17.342236 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kube-rbac-proxy-node" containerID="cri-o://4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7" gracePeriod=30 Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 18:46:17.342348 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="nbdb" containerID="cri-o://98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" gracePeriod=30 Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 
18:46:17.342433 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="sbdb" containerID="cri-o://1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" gracePeriod=30 Jan 28 18:46:17 crc kubenswrapper[4749]: I0128 18:46:17.394521 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovnkube-controller" containerID="cri-o://9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc" gracePeriod=30 Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.204551 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wzvwl_290d31fb-b204-4b3c-84ea-a5d597748b18/ovn-acl-logging/0.log" Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205310 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wzvwl_290d31fb-b204-4b3c-84ea-a5d597748b18/ovn-controller/0.log" Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205745 4749 generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc" exitCode=0 Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205771 4749 generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" exitCode=0 Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205778 4749 generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" exitCode=0 Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205785 4749 generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1" exitCode=0 Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205792 4749 generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf" exitCode=143 Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205800 4749 generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4" exitCode=143 Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc"} Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397"} Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" 
event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4"} Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205881 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1"} Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf"} Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.205899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4"} Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.207434 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z8jvg_bb6b5d81-5370-425d-a3f6-ebfc447a3d27/kube-multus/0.log" Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.207465 4749 generic.go:334] "Generic (PLEG): container finished" podID="bb6b5d81-5370-425d-a3f6-ebfc447a3d27" containerID="daf25062d286e005e2e33e89ad24c0e39c08bb54adb935c08e1d664dc39d1a98" exitCode=2 Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.207486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z8jvg" event={"ID":"bb6b5d81-5370-425d-a3f6-ebfc447a3d27","Type":"ContainerDied","Data":"daf25062d286e005e2e33e89ad24c0e39c08bb54adb935c08e1d664dc39d1a98"} Jan 28 18:46:18 crc kubenswrapper[4749]: I0128 18:46:18.208196 4749 scope.go:117] "RemoveContainer" containerID="daf25062d286e005e2e33e89ad24c0e39c08bb54adb935c08e1d664dc39d1a98" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.031684 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4 is running failed: container process not found" containerID="98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.031898 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397 is running failed: container process not found" containerID="1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.032798 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397 is running failed: container process not found" containerID="1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.032832 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4 is running failed: container process not found" containerID="98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.033110 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397 is running failed: container process not found" containerID="1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.033215 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="sbdb" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.033794 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4 is running failed: container process not found" containerID="98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.033839 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="nbdb" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.091351 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wzvwl_290d31fb-b204-4b3c-84ea-a5d597748b18/ovn-acl-logging/0.log" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.091885 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wzvwl_290d31fb-b204-4b3c-84ea-a5d597748b18/ovn-controller/0.log" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.092419 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.165997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-node-log\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-etc-openvswitch\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-systemd-units\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166140 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-log-socket\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166172 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fjp9\" (UniqueName: \"kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-script-lib\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/290d31fb-b204-4b3c-84ea-a5d597748b18-ovn-node-metrics-cert\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166209 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-node-log" (OuterVolumeSpecName: "node-log") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166235 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-kubelet\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-systemd\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-ovn-kubernetes\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166416 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166440 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-log-socket" (OuterVolumeSpecName: "log-socket") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166468 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-netns\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-ovn\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-openvswitch\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166563 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-env-overrides\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166621 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-slash\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166660 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-netd\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-var-lib-openvswitch\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-config\") pod \"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.166792 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-bin\") pod 
\"290d31fb-b204-4b3c-84ea-a5d597748b18\" (UID: \"290d31fb-b204-4b3c-84ea-a5d597748b18\") " Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167291 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167296 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167320 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-slash" (OuterVolumeSpecName: "host-slash") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167398 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167627 4749 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-node-log\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167643 4749 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167654 4749 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167673 4749 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-log-socket\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167684 4749 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167693 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167702 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167710 4749 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167718 4749 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167729 4749 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167739 4749 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-slash\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167747 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167756 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.167765 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.168582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.168599 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.168622 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.173899 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6wgpf"] Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174215 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="northd" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174235 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="northd" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174251 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovn-controller" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174259 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovn-controller" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174270 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerName="pull" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174277 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerName="pull" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174289 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerName="util" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174299 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerName="util" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174312 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="sbdb" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174319 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="sbdb" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174347 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="nbdb" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174355 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="nbdb" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174365 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovn-acl-logging" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174372 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovn-acl-logging" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174383 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kube-rbac-proxy-node" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174390 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kube-rbac-proxy-node" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174398 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kube-rbac-proxy-ovn-metrics" Jan 28 18:46:19 crc 
kubenswrapper[4749]: I0128 18:46:19.174405 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kube-rbac-proxy-ovn-metrics" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174414 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerName="extract" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174422 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerName="extract" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174434 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kubecfg-setup" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174441 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kubecfg-setup" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.174456 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovnkube-controller" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174464 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovnkube-controller" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174597 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovn-controller" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174615 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="sbdb" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174628 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovnkube-controller" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174642 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kube-rbac-proxy-ovn-metrics" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174655 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="northd" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174667 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="ovn-acl-logging" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174678 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="nbdb" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174689 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerName="kube-rbac-proxy-node" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.174700 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f22f1a6-aca7-47f1-9cf4-1218d6dd3931" containerName="extract" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.176702 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.177816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290d31fb-b204-4b3c-84ea-a5d597748b18-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.178977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9" (OuterVolumeSpecName: "kube-api-access-8fjp9") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "kube-api-access-8fjp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.199122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "290d31fb-b204-4b3c-84ea-a5d597748b18" (UID: "290d31fb-b204-4b3c-84ea-a5d597748b18"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.217859 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z8jvg_bb6b5d81-5370-425d-a3f6-ebfc447a3d27/kube-multus/0.log" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.217944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z8jvg" event={"ID":"bb6b5d81-5370-425d-a3f6-ebfc447a3d27","Type":"ContainerStarted","Data":"98422df65bb1e6761490f513b386b22197140256f8d42054793f590992414243"} Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.223254 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wzvwl_290d31fb-b204-4b3c-84ea-a5d597748b18/ovn-acl-logging/0.log" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.224014 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wzvwl_290d31fb-b204-4b3c-84ea-a5d597748b18/ovn-controller/0.log" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.224687 4749 generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e" exitCode=0 Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.224763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e"} Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.224838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7"} Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.224864 4749 scope.go:117] "RemoveContainer" containerID="9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.224984 4749 
generic.go:334] "Generic (PLEG): container finished" podID="290d31fb-b204-4b3c-84ea-a5d597748b18" containerID="4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7" exitCode=0 Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.225079 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.225086 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wzvwl" event={"ID":"290d31fb-b204-4b3c-84ea-a5d597748b18","Type":"ContainerDied","Data":"fdc77ac81bee03866949dd3e860090a1775d47ebce943b27146d074b8d25bdfc"} Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.247510 4749 scope.go:117] "RemoveContainer" containerID="1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.266837 4749 scope.go:117] "RemoveContainer" containerID="98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.272739 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wzvwl"] Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.273960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-ovnkube-script-lib\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-etc-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-ovn\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-cni-bin\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-var-lib-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-systemd-units\") pod \"ovnkube-node-6wgpf\" (UID: 
\"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-ovnkube-config\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-log-socket\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274312 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-cni-netd\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274371 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-run-ovn-kubernetes\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274404 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/81308a07-0346-4306-bbb3-d58d09fce643-ovn-node-metrics-cert\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-kubelet\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lcbw\" (UniqueName: \"kubernetes.io/projected/81308a07-0346-4306-bbb3-d58d09fce643-kube-api-access-9lcbw\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-slash\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274837 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-run-netns\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-env-overrides\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-node-log\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.274965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-systemd\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.275059 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.275074 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.275087 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fjp9\" (UniqueName: \"kubernetes.io/projected/290d31fb-b204-4b3c-84ea-a5d597748b18-kube-api-access-8fjp9\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.275099 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/290d31fb-b204-4b3c-84ea-a5d597748b18-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.275111 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/290d31fb-b204-4b3c-84ea-a5d597748b18-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.275121 4749 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/290d31fb-b204-4b3c-84ea-a5d597748b18-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.285737 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wzvwl"] Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.296352 4749 scope.go:117] "RemoveContainer" containerID="79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.321733 4749 scope.go:117] "RemoveContainer" containerID="c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.338438 4749 scope.go:117] "RemoveContainer" containerID="4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.356583 4749 scope.go:117] "RemoveContainer" containerID="88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376440 4749 scope.go:117] "RemoveContainer" containerID="45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-node-log\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376819 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-systemd\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-ovnkube-script-lib\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-etc-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-ovn\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-cni-bin\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-var-lib-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-node-log\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-systemd-units\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-ovnkube-config\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376982 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-log-socket\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377001 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-ovn\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-cni-netd\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-run-ovn-kubernetes\") pod \"ovnkube-node-6wgpf\" (UID: 
\"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/81308a07-0346-4306-bbb3-d58d09fce643-ovn-node-metrics-cert\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-systemd\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-cni-bin\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377149 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-kubelet\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-kubelet\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377180 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-run-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.376982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-etc-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lcbw\" (UniqueName: \"kubernetes.io/projected/81308a07-0346-4306-bbb3-d58d09fce643-kube-api-access-9lcbw\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-cni-netd\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: 
I0128 18:46:19.377228 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-slash\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-log-socket\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-var-lib-openvswitch\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-systemd-units\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-run-ovn-kubernetes\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-slash\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-run-netns\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-env-overrides\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81308a07-0346-4306-bbb3-d58d09fce643-host-run-netns\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-ovnkube-script-lib\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-ovnkube-config\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.377948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/81308a07-0346-4306-bbb3-d58d09fce643-env-overrides\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.383718 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/81308a07-0346-4306-bbb3-d58d09fce643-ovn-node-metrics-cert\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.396903 4749 scope.go:117] "RemoveContainer" containerID="1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.397979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lcbw\" (UniqueName: \"kubernetes.io/projected/81308a07-0346-4306-bbb3-d58d09fce643-kube-api-access-9lcbw\") pod \"ovnkube-node-6wgpf\" (UID: \"81308a07-0346-4306-bbb3-d58d09fce643\") " pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.414948 4749 scope.go:117] "RemoveContainer" containerID="9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.415429 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc\": container with ID starting with 9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc not found: ID does not exist" containerID="9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.415489 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc"} err="failed to get container status \"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc\": rpc error: code = NotFound desc = could not find container \"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc\": container with ID starting with 9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.415525 4749 scope.go:117] "RemoveContainer" containerID="1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.415889 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397\": container with ID starting with 1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397 not found: ID does not exist" containerID="1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.415937 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397"} err="failed to get container status \"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397\": rpc error: code = NotFound desc = could not find container \"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397\": container with ID starting with 1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.415967 4749 scope.go:117] "RemoveContainer" containerID="98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.416216 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4\": container with ID starting with 98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4 not found: ID does not exist" containerID="98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.416246 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4"} err="failed to get container status \"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4\": rpc error: code = NotFound desc = could not find container \"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4\": container with ID starting with 98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.416266 4749 scope.go:117] "RemoveContainer" containerID="79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.416783 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1\": container with ID starting with 79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1 not found: ID does not exist" 
containerID="79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.416816 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1"} err="failed to get container status \"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1\": rpc error: code = NotFound desc = could not find container \"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1\": container with ID starting with 79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.416835 4749 scope.go:117] "RemoveContainer" containerID="c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.417095 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e\": container with ID starting with c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e not found: ID does not exist" containerID="c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.417140 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e"} err="failed to get container status \"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e\": rpc error: code = NotFound desc = could not find container \"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e\": container with ID starting with c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.417169 4749 scope.go:117] "RemoveContainer" containerID="4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.417480 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7\": container with ID starting with 4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7 not found: ID does not exist" containerID="4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.417507 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7"} err="failed to get container status \"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7\": rpc error: code = NotFound desc = could not find container \"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7\": container with ID starting with 4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.417527 4749 scope.go:117] "RemoveContainer" containerID="88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.417819 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf\": container with ID starting with 88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf not found: ID does not exist" containerID="88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.417837 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf"} err="failed to get container status \"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf\": rpc error: code = NotFound desc = could not find container \"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf\": container with ID starting with 88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.417850 4749 scope.go:117] "RemoveContainer" containerID="45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.418044 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4\": container with ID starting with 45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4 not found: ID does not exist" containerID="45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.418072 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4"} err="failed to get container status \"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4\": rpc error: code = NotFound desc = could not find container \"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4\": container with ID starting with 45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.418092 4749 scope.go:117] "RemoveContainer" containerID="1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906" Jan 28 18:46:19 crc kubenswrapper[4749]: E0128 18:46:19.418465 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906\": container with ID starting with 1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906 not found: ID does not exist" containerID="1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.418514 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906"} err="failed to get container status \"1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906\": rpc error: code = NotFound desc = could not find container \"1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906\": container with ID starting with 1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.418539 4749 scope.go:117] "RemoveContainer" containerID="9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc" Jan 28 18:46:19 crc 
kubenswrapper[4749]: I0128 18:46:19.418805 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc"} err="failed to get container status \"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc\": rpc error: code = NotFound desc = could not find container \"9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc\": container with ID starting with 9916b213d77e10728745208d06e9619699c6d7c45e5aed823812e1e0bc52e0cc not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.418828 4749 scope.go:117] "RemoveContainer" containerID="1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.419114 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397"} err="failed to get container status \"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397\": rpc error: code = NotFound desc = could not find container \"1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397\": container with ID starting with 1062298c44958c9d96e89e5eb4c878829a3bd90c3a166ffbd43db78fca516397 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.419138 4749 scope.go:117] "RemoveContainer" containerID="98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.419403 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4"} err="failed to get container status \"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4\": rpc error: code = NotFound desc = could not find container \"98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4\": container with ID starting with 98143ccc91685bee3b16edd6d5c178dd47ad7f75188df584ea8761c710a76ba4 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.419430 4749 scope.go:117] "RemoveContainer" containerID="79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.419831 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1"} err="failed to get container status \"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1\": rpc error: code = NotFound desc = could not find container \"79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1\": container with ID starting with 79a8d853dc53fa47a8528054bac8227dd714287981d2025b5067f0f640a2cdb1 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.419856 4749 scope.go:117] "RemoveContainer" containerID="c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.420109 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e"} err="failed to get container status \"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e\": rpc error: code = NotFound desc = could not find container \"c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e\": container with ID 
starting with c15326694e5b927975b242463a7292f50cdb2b694444bae2a693239093dea02e not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.420133 4749 scope.go:117] "RemoveContainer" containerID="4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.420376 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7"} err="failed to get container status \"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7\": rpc error: code = NotFound desc = could not find container \"4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7\": container with ID starting with 4480bcd13c38dee25e9293c61d2e2ddee3806d215fde571bf65376174e5382a7 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.420410 4749 scope.go:117] "RemoveContainer" containerID="88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.420685 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf"} err="failed to get container status \"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf\": rpc error: code = NotFound desc = could not find container \"88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf\": container with ID starting with 88af47d7bd02cccdb8a5e428025b890ecec03186c0ac72c7dd9195df06cdc0cf not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.420712 4749 scope.go:117] "RemoveContainer" containerID="45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.420935 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4"} err="failed to get container status \"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4\": rpc error: code = NotFound desc = could not find container \"45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4\": container with ID starting with 45369f9ff65681220db59ca6c38c96bb159d0da7589ee23829f185d72f141ec4 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.420958 4749 scope.go:117] "RemoveContainer" containerID="1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.421180 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906"} err="failed to get container status \"1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906\": rpc error: code = NotFound desc = could not find container \"1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906\": container with ID starting with 1fc54d574d7276b80b58f4a8f31880822a2f0e889cc79af289d8e773ca3c6906 not found: ID does not exist" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.510295 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.904744 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7"] Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.905492 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.909996 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.910170 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-4qwff" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.910307 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 28 18:46:19 crc kubenswrapper[4749]: I0128 18:46:19.986668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wst6\" (UniqueName: \"kubernetes.io/projected/2f8457f9-3010-4acc-88d1-97e5bec85c2c-kube-api-access-7wst6\") pod \"obo-prometheus-operator-68bc856cb9-hmgb7\" (UID: \"2f8457f9-3010-4acc-88d1-97e5bec85c2c\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.021086 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b"] Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.021984 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.023974 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qr57b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.025593 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.039930 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4"] Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.040866 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.088025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f02f0f9-aa18-4986-af31-25b776f67fb7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b\" (UID: \"4f02f0f9-aa18-4986-af31-25b776f67fb7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.088080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wst6\" (UniqueName: \"kubernetes.io/projected/2f8457f9-3010-4acc-88d1-97e5bec85c2c-kube-api-access-7wst6\") pod \"obo-prometheus-operator-68bc856cb9-hmgb7\" (UID: \"2f8457f9-3010-4acc-88d1-97e5bec85c2c\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.088112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f02f0f9-aa18-4986-af31-25b776f67fb7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b\" (UID: \"4f02f0f9-aa18-4986-af31-25b776f67fb7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.088132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84cf86f4-9829-41b0-8151-028ef75f861e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4\" (UID: \"84cf86f4-9829-41b0-8151-028ef75f861e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.088163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84cf86f4-9829-41b0-8151-028ef75f861e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4\" (UID: \"84cf86f4-9829-41b0-8151-028ef75f861e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.112976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wst6\" (UniqueName: \"kubernetes.io/projected/2f8457f9-3010-4acc-88d1-97e5bec85c2c-kube-api-access-7wst6\") pod \"obo-prometheus-operator-68bc856cb9-hmgb7\" (UID: \"2f8457f9-3010-4acc-88d1-97e5bec85c2c\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.129732 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-4kwb4"] Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.130649 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.132562 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.132645 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-hlfb4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.189349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkccc\" (UniqueName: \"kubernetes.io/projected/502b92eb-ac87-456c-933a-7e9ff562e326-kube-api-access-bkccc\") pod \"observability-operator-59bdc8b94-4kwb4\" (UID: \"502b92eb-ac87-456c-933a-7e9ff562e326\") " pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.189418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/502b92eb-ac87-456c-933a-7e9ff562e326-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4kwb4\" (UID: \"502b92eb-ac87-456c-933a-7e9ff562e326\") " pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.189465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f02f0f9-aa18-4986-af31-25b776f67fb7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b\" (UID: \"4f02f0f9-aa18-4986-af31-25b776f67fb7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.189501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f02f0f9-aa18-4986-af31-25b776f67fb7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b\" (UID: \"4f02f0f9-aa18-4986-af31-25b776f67fb7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.189526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84cf86f4-9829-41b0-8151-028ef75f861e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4\" (UID: \"84cf86f4-9829-41b0-8151-028ef75f861e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.189568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84cf86f4-9829-41b0-8151-028ef75f861e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4\" (UID: \"84cf86f4-9829-41b0-8151-028ef75f861e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.195536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f02f0f9-aa18-4986-af31-25b776f67fb7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b\" (UID: 
\"4f02f0f9-aa18-4986-af31-25b776f67fb7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.196794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84cf86f4-9829-41b0-8151-028ef75f861e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4\" (UID: \"84cf86f4-9829-41b0-8151-028ef75f861e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.198834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84cf86f4-9829-41b0-8151-028ef75f861e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4\" (UID: \"84cf86f4-9829-41b0-8151-028ef75f861e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.199955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f02f0f9-aa18-4986-af31-25b776f67fb7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b\" (UID: \"4f02f0f9-aa18-4986-af31-25b776f67fb7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.232242 4749 generic.go:334] "Generic (PLEG): container finished" podID="81308a07-0346-4306-bbb3-d58d09fce643" containerID="d229fb070caa9151d4cfbf49c5d9f4d162938c232709312dba442f3e520ddc45" exitCode=0 Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.232296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerDied","Data":"d229fb070caa9151d4cfbf49c5d9f4d162938c232709312dba442f3e520ddc45"} Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.232357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"afe6d8f0687f99f5a09297e080e684f0255e4017fbccd5bfda5c05a7413c9475"} Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.272193 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-c6vlq"] Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.272946 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.277858 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-8ml69" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.281122 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.291749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsgl6\" (UniqueName: \"kubernetes.io/projected/45c3f22c-a523-4e94-858c-97bdb2705b9e-kube-api-access-rsgl6\") pod \"perses-operator-5bf474d74f-c6vlq\" (UID: \"45c3f22c-a523-4e94-858c-97bdb2705b9e\") " pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.291800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkccc\" (UniqueName: \"kubernetes.io/projected/502b92eb-ac87-456c-933a-7e9ff562e326-kube-api-access-bkccc\") pod \"observability-operator-59bdc8b94-4kwb4\" (UID: \"502b92eb-ac87-456c-933a-7e9ff562e326\") " pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.291839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/502b92eb-ac87-456c-933a-7e9ff562e326-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4kwb4\" (UID: \"502b92eb-ac87-456c-933a-7e9ff562e326\") " pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.291871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/45c3f22c-a523-4e94-858c-97bdb2705b9e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-c6vlq\" (UID: \"45c3f22c-a523-4e94-858c-97bdb2705b9e\") " pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.299140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/502b92eb-ac87-456c-933a-7e9ff562e326-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4kwb4\" (UID: \"502b92eb-ac87-456c-933a-7e9ff562e326\") " pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.317219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkccc\" (UniqueName: \"kubernetes.io/projected/502b92eb-ac87-456c-933a-7e9ff562e326-kube-api-access-bkccc\") pod \"observability-operator-59bdc8b94-4kwb4\" (UID: \"502b92eb-ac87-456c-933a-7e9ff562e326\") " pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.323280 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators_2f8457f9-3010-4acc-88d1-97e5bec85c2c_0(7c13582f28e3a28e5b699d8165bc0e1c6721b35caca9919827b7faa314ab1ead): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.323378 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators_2f8457f9-3010-4acc-88d1-97e5bec85c2c_0(7c13582f28e3a28e5b699d8165bc0e1c6721b35caca9919827b7faa314ab1ead): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.323405 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators_2f8457f9-3010-4acc-88d1-97e5bec85c2c_0(7c13582f28e3a28e5b699d8165bc0e1c6721b35caca9919827b7faa314ab1ead): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.323460 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators(2f8457f9-3010-4acc-88d1-97e5bec85c2c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators(2f8457f9-3010-4acc-88d1-97e5bec85c2c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators_2f8457f9-3010-4acc-88d1-97e5bec85c2c_0(7c13582f28e3a28e5b699d8165bc0e1c6721b35caca9919827b7faa314ab1ead): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" podUID="2f8457f9-3010-4acc-88d1-97e5bec85c2c" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.339791 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.356879 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.384530 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators_4f02f0f9-aa18-4986-af31-25b776f67fb7_0(1a63818a5f3cbbc644124d721039fe66a64bcddb349989f57ff41ded53ee52fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.384858 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators_4f02f0f9-aa18-4986-af31-25b776f67fb7_0(1a63818a5f3cbbc644124d721039fe66a64bcddb349989f57ff41ded53ee52fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.384887 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators_4f02f0f9-aa18-4986-af31-25b776f67fb7_0(1a63818a5f3cbbc644124d721039fe66a64bcddb349989f57ff41ded53ee52fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.384937 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators(4f02f0f9-aa18-4986-af31-25b776f67fb7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators(4f02f0f9-aa18-4986-af31-25b776f67fb7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators_4f02f0f9-aa18-4986-af31-25b776f67fb7_0(1a63818a5f3cbbc644124d721039fe66a64bcddb349989f57ff41ded53ee52fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" podUID="4f02f0f9-aa18-4986-af31-25b776f67fb7" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.390745 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators_84cf86f4-9829-41b0-8151-028ef75f861e_0(793546172a29706abacc64a79bca17f76f7c8153123d112648c2246419689a95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.390852 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators_84cf86f4-9829-41b0-8151-028ef75f861e_0(793546172a29706abacc64a79bca17f76f7c8153123d112648c2246419689a95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.390880 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators_84cf86f4-9829-41b0-8151-028ef75f861e_0(793546172a29706abacc64a79bca17f76f7c8153123d112648c2246419689a95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.390944 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators(84cf86f4-9829-41b0-8151-028ef75f861e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators(84cf86f4-9829-41b0-8151-028ef75f861e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators_84cf86f4-9829-41b0-8151-028ef75f861e_0(793546172a29706abacc64a79bca17f76f7c8153123d112648c2246419689a95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" podUID="84cf86f4-9829-41b0-8151-028ef75f861e" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.392706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsgl6\" (UniqueName: \"kubernetes.io/projected/45c3f22c-a523-4e94-858c-97bdb2705b9e-kube-api-access-rsgl6\") pod \"perses-operator-5bf474d74f-c6vlq\" (UID: \"45c3f22c-a523-4e94-858c-97bdb2705b9e\") " pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.392761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/45c3f22c-a523-4e94-858c-97bdb2705b9e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-c6vlq\" (UID: \"45c3f22c-a523-4e94-858c-97bdb2705b9e\") " pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.393791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/45c3f22c-a523-4e94-858c-97bdb2705b9e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-c6vlq\" (UID: \"45c3f22c-a523-4e94-858c-97bdb2705b9e\") " pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.415056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsgl6\" (UniqueName: \"kubernetes.io/projected/45c3f22c-a523-4e94-858c-97bdb2705b9e-kube-api-access-rsgl6\") pod \"perses-operator-5bf474d74f-c6vlq\" (UID: \"45c3f22c-a523-4e94-858c-97bdb2705b9e\") " pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.449972 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.486785 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4kwb4_openshift-operators_502b92eb-ac87-456c-933a-7e9ff562e326_0(943b8df0a17406421ab01b1bfc8b7ce8c4143692419f5be04ad1f1aea72ebb77): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.486871 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4kwb4_openshift-operators_502b92eb-ac87-456c-933a-7e9ff562e326_0(943b8df0a17406421ab01b1bfc8b7ce8c4143692419f5be04ad1f1aea72ebb77): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.486895 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4kwb4_openshift-operators_502b92eb-ac87-456c-933a-7e9ff562e326_0(943b8df0a17406421ab01b1bfc8b7ce8c4143692419f5be04ad1f1aea72ebb77): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.486941 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-4kwb4_openshift-operators(502b92eb-ac87-456c-933a-7e9ff562e326)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-4kwb4_openshift-operators(502b92eb-ac87-456c-933a-7e9ff562e326)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4kwb4_openshift-operators_502b92eb-ac87-456c-933a-7e9ff562e326_0(943b8df0a17406421ab01b1bfc8b7ce8c4143692419f5be04ad1f1aea72ebb77): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" podUID="502b92eb-ac87-456c-933a-7e9ff562e326" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.594075 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.625243 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-c6vlq_openshift-operators_45c3f22c-a523-4e94-858c-97bdb2705b9e_0(53c36e18933d9dd54969b75b06726e70486e183ca6c1baf67374b15bd3e3ee22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.625308 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-c6vlq_openshift-operators_45c3f22c-a523-4e94-858c-97bdb2705b9e_0(53c36e18933d9dd54969b75b06726e70486e183ca6c1baf67374b15bd3e3ee22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.625343 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-c6vlq_openshift-operators_45c3f22c-a523-4e94-858c-97bdb2705b9e_0(53c36e18933d9dd54969b75b06726e70486e183ca6c1baf67374b15bd3e3ee22): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:20 crc kubenswrapper[4749]: E0128 18:46:20.625393 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-c6vlq_openshift-operators(45c3f22c-a523-4e94-858c-97bdb2705b9e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-c6vlq_openshift-operators(45c3f22c-a523-4e94-858c-97bdb2705b9e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-c6vlq_openshift-operators_45c3f22c-a523-4e94-858c-97bdb2705b9e_0(53c36e18933d9dd54969b75b06726e70486e183ca6c1baf67374b15bd3e3ee22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" podUID="45c3f22c-a523-4e94-858c-97bdb2705b9e" Jan 28 18:46:20 crc kubenswrapper[4749]: I0128 18:46:20.879284 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290d31fb-b204-4b3c-84ea-a5d597748b18" path="/var/lib/kubelet/pods/290d31fb-b204-4b3c-84ea-a5d597748b18/volumes" Jan 28 18:46:21 crc kubenswrapper[4749]: I0128 18:46:21.239831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"f93336107dc0d7222624ceab328483f11f0d3130c27239eea5ab231e5551f328"} Jan 28 18:46:21 crc kubenswrapper[4749]: I0128 18:46:21.240142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"9eabd3e3d335eedb5fe3a3d0eab8004254f909ba4610dc74b8e5427b55f7ad6a"} Jan 28 18:46:21 crc kubenswrapper[4749]: I0128 18:46:21.240158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"2e111e7c6d17d09d225fc8658f2614a5d6c2938312134dc82de3b0bdcc512a6f"} Jan 28 18:46:21 crc kubenswrapper[4749]: I0128 18:46:21.240168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"d9f7cc96e84f5ad51f057f426590599c5bf072b5132ee01c92e730797c931fb3"} Jan 28 18:46:21 crc kubenswrapper[4749]: I0128 18:46:21.240177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"5875cce9a215696981e83e26bd8cb0667f743c95cd665bd0285fe9971b6e8b80"} Jan 28 18:46:21 crc kubenswrapper[4749]: I0128 18:46:21.240184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"0345a8dfd9be14c61c27f51a9d61d4785f01a8bc83e9a74b98127135c04ed2d7"} Jan 28 18:46:23 crc kubenswrapper[4749]: I0128 18:46:23.269736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"79cdf9f2cfc1ee1e5f22ed10076877e89d09e39bb694a7b0e1c0981bcdf6fa1a"} Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.146030 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b"] Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.146713 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.147228 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.157297 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-c6vlq"] Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.157484 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.158108 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.200634 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7"] Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.200849 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.201434 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.256101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-4kwb4"] Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.256236 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.256716 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.259718 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators_4f02f0f9-aa18-4986-af31-25b776f67fb7_0(f19d2c2d33f105311a3ba539aaa7ad6fbf92915daa9818d0d078c9bdb900c607): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.259823 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators_4f02f0f9-aa18-4986-af31-25b776f67fb7_0(f19d2c2d33f105311a3ba539aaa7ad6fbf92915daa9818d0d078c9bdb900c607): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.259858 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators_4f02f0f9-aa18-4986-af31-25b776f67fb7_0(f19d2c2d33f105311a3ba539aaa7ad6fbf92915daa9818d0d078c9bdb900c607): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.259933 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators(4f02f0f9-aa18-4986-af31-25b776f67fb7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators(4f02f0f9-aa18-4986-af31-25b776f67fb7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_openshift-operators_4f02f0f9-aa18-4986-af31-25b776f67fb7_0(f19d2c2d33f105311a3ba539aaa7ad6fbf92915daa9818d0d078c9bdb900c607): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" podUID="4f02f0f9-aa18-4986-af31-25b776f67fb7" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.263552 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4"] Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.263712 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.264288 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.282489 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators_2f8457f9-3010-4acc-88d1-97e5bec85c2c_0(2a9d52abd1727fbfc18f71fa15ab21b0dd7dc73337edbc716774b8cee33e14a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.282556 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators_2f8457f9-3010-4acc-88d1-97e5bec85c2c_0(2a9d52abd1727fbfc18f71fa15ab21b0dd7dc73337edbc716774b8cee33e14a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.282578 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators_2f8457f9-3010-4acc-88d1-97e5bec85c2c_0(2a9d52abd1727fbfc18f71fa15ab21b0dd7dc73337edbc716774b8cee33e14a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.282645 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators(2f8457f9-3010-4acc-88d1-97e5bec85c2c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators(2f8457f9-3010-4acc-88d1-97e5bec85c2c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-hmgb7_openshift-operators_2f8457f9-3010-4acc-88d1-97e5bec85c2c_0(2a9d52abd1727fbfc18f71fa15ab21b0dd7dc73337edbc716774b8cee33e14a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" podUID="2f8457f9-3010-4acc-88d1-97e5bec85c2c" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.285209 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-c6vlq_openshift-operators_45c3f22c-a523-4e94-858c-97bdb2705b9e_0(87e84386d292263a0dcc6fe8c69b9b8f262f3ed25a61fdd48772dff1e4dd7a63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.285244 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-c6vlq_openshift-operators_45c3f22c-a523-4e94-858c-97bdb2705b9e_0(87e84386d292263a0dcc6fe8c69b9b8f262f3ed25a61fdd48772dff1e4dd7a63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.285263 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-c6vlq_openshift-operators_45c3f22c-a523-4e94-858c-97bdb2705b9e_0(87e84386d292263a0dcc6fe8c69b9b8f262f3ed25a61fdd48772dff1e4dd7a63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.285291 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-c6vlq_openshift-operators(45c3f22c-a523-4e94-858c-97bdb2705b9e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-c6vlq_openshift-operators(45c3f22c-a523-4e94-858c-97bdb2705b9e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-c6vlq_openshift-operators_45c3f22c-a523-4e94-858c-97bdb2705b9e_0(87e84386d292263a0dcc6fe8c69b9b8f262f3ed25a61fdd48772dff1e4dd7a63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" podUID="45c3f22c-a523-4e94-858c-97bdb2705b9e" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.303415 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators_84cf86f4-9829-41b0-8151-028ef75f861e_0(41b6cc26e849f4455d177ab1a4f41883250f55bcec7a5e7f016c053e96e3d486): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.303462 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators_84cf86f4-9829-41b0-8151-028ef75f861e_0(41b6cc26e849f4455d177ab1a4f41883250f55bcec7a5e7f016c053e96e3d486): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.303482 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators_84cf86f4-9829-41b0-8151-028ef75f861e_0(41b6cc26e849f4455d177ab1a4f41883250f55bcec7a5e7f016c053e96e3d486): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.303521 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators(84cf86f4-9829-41b0-8151-028ef75f861e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators(84cf86f4-9829-41b0-8151-028ef75f861e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_openshift-operators_84cf86f4-9829-41b0-8151-028ef75f861e_0(41b6cc26e849f4455d177ab1a4f41883250f55bcec7a5e7f016c053e96e3d486): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" podUID="84cf86f4-9829-41b0-8151-028ef75f861e" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.306983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" event={"ID":"81308a07-0346-4306-bbb3-d58d09fce643","Type":"ContainerStarted","Data":"e353e8ae0b99e4b5a47c78fe85c4bd506af58c14079af1af0ef0c07367f0cb35"} Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.308373 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.308401 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.308447 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.373726 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.374724 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4kwb4_openshift-operators_502b92eb-ac87-456c-933a-7e9ff562e326_0(ecab38b3bd58be62aa7d38cffa8c5379b0c22e2a69a8b78f986e79aa55c9828e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.374791 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4kwb4_openshift-operators_502b92eb-ac87-456c-933a-7e9ff562e326_0(ecab38b3bd58be62aa7d38cffa8c5379b0c22e2a69a8b78f986e79aa55c9828e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.374812 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4kwb4_openshift-operators_502b92eb-ac87-456c-933a-7e9ff562e326_0(ecab38b3bd58be62aa7d38cffa8c5379b0c22e2a69a8b78f986e79aa55c9828e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:27 crc kubenswrapper[4749]: E0128 18:46:27.374851 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-4kwb4_openshift-operators(502b92eb-ac87-456c-933a-7e9ff562e326)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-4kwb4_openshift-operators(502b92eb-ac87-456c-933a-7e9ff562e326)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4kwb4_openshift-operators_502b92eb-ac87-456c-933a-7e9ff562e326_0(ecab38b3bd58be62aa7d38cffa8c5379b0c22e2a69a8b78f986e79aa55c9828e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" podUID="502b92eb-ac87-456c-933a-7e9ff562e326" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.381623 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:27 crc kubenswrapper[4749]: I0128 18:46:27.437307 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" podStartSLOduration=8.437272477 podStartE2EDuration="8.437272477s" podCreationTimestamp="2026-01-28 18:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:46:27.370127807 +0000 UTC m=+655.381654602" watchObservedRunningTime="2026-01-28 18:46:27.437272477 +0000 UTC m=+655.448799252" Jan 28 18:46:28 crc kubenswrapper[4749]: I0128 18:46:28.475921 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 18:46:38 crc kubenswrapper[4749]: I0128 18:46:38.871503 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:38 crc kubenswrapper[4749]: I0128 18:46:38.872436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" Jan 28 18:46:39 crc kubenswrapper[4749]: I0128 18:46:39.131632 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4"] Jan 28 18:46:39 crc kubenswrapper[4749]: I0128 18:46:39.369175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" event={"ID":"84cf86f4-9829-41b0-8151-028ef75f861e","Type":"ContainerStarted","Data":"95b126466b188cebfaad71992baa7e475534c5d6f509109c216d851830f8ae54"} Jan 28 18:46:40 crc kubenswrapper[4749]: I0128 18:46:40.871025 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:40 crc kubenswrapper[4749]: I0128 18:46:40.871472 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:41 crc kubenswrapper[4749]: I0128 18:46:41.336439 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-c6vlq"] Jan 28 18:46:41 crc kubenswrapper[4749]: I0128 18:46:41.872976 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:41 crc kubenswrapper[4749]: I0128 18:46:41.873065 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:41 crc kubenswrapper[4749]: I0128 18:46:41.873759 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" Jan 28 18:46:41 crc kubenswrapper[4749]: I0128 18:46:41.873891 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:42 crc kubenswrapper[4749]: I0128 18:46:42.871079 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:42 crc kubenswrapper[4749]: I0128 18:46:42.878253 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" Jan 28 18:46:43 crc kubenswrapper[4749]: I0128 18:46:43.040495 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7"] Jan 28 18:46:43 crc kubenswrapper[4749]: W0128 18:46:43.046687 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f8457f9_3010_4acc_88d1_97e5bec85c2c.slice/crio-4a01bf55e43d9ce23427ef5f0bc4a4ce3e11b27f1a9724ec79b79cacc697867e WatchSource:0}: Error finding container 4a01bf55e43d9ce23427ef5f0bc4a4ce3e11b27f1a9724ec79b79cacc697867e: Status 404 returned error can't find the container with id 4a01bf55e43d9ce23427ef5f0bc4a4ce3e11b27f1a9724ec79b79cacc697867e Jan 28 18:46:43 crc kubenswrapper[4749]: I0128 18:46:43.360309 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-4kwb4"] Jan 28 18:46:43 crc kubenswrapper[4749]: W0128 18:46:43.364992 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod502b92eb_ac87_456c_933a_7e9ff562e326.slice/crio-d014d68f7ad7ab9ad56fe7c4b20e833df6438cd2c77f822ffb893a9816dc66f7 WatchSource:0}: Error finding container d014d68f7ad7ab9ad56fe7c4b20e833df6438cd2c77f822ffb893a9816dc66f7: Status 404 returned error can't find the container with id d014d68f7ad7ab9ad56fe7c4b20e833df6438cd2c77f822ffb893a9816dc66f7 Jan 28 18:46:43 crc kubenswrapper[4749]: I0128 18:46:43.407610 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b"] Jan 28 18:46:43 crc kubenswrapper[4749]: I0128 18:46:43.410954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" event={"ID":"502b92eb-ac87-456c-933a-7e9ff562e326","Type":"ContainerStarted","Data":"d014d68f7ad7ab9ad56fe7c4b20e833df6438cd2c77f822ffb893a9816dc66f7"} Jan 28 18:46:43 crc kubenswrapper[4749]: I0128 18:46:43.415136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" event={"ID":"84cf86f4-9829-41b0-8151-028ef75f861e","Type":"ContainerStarted","Data":"7f9286859f60f811309dbd8c2d7d7fd7c0953d66fd73d0da8275863b545c2021"} Jan 28 18:46:43 crc kubenswrapper[4749]: I0128 18:46:43.417104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" event={"ID":"2f8457f9-3010-4acc-88d1-97e5bec85c2c","Type":"ContainerStarted","Data":"4a01bf55e43d9ce23427ef5f0bc4a4ce3e11b27f1a9724ec79b79cacc697867e"} Jan 28 18:46:43 crc kubenswrapper[4749]: I0128 18:46:43.419374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" event={"ID":"45c3f22c-a523-4e94-858c-97bdb2705b9e","Type":"ContainerStarted","Data":"7fabb92c8e43bf4fdbe1e7ac8766dac7c24b8064dea79c723391f53b095699ac"} Jan 28 
18:46:43 crc kubenswrapper[4749]: I0128 18:46:43.448999 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4" podStartSLOduration=19.715172661 podStartE2EDuration="23.448973492s" podCreationTimestamp="2026-01-28 18:46:20 +0000 UTC" firstStartedPulling="2026-01-28 18:46:39.144765724 +0000 UTC m=+667.156292489" lastFinishedPulling="2026-01-28 18:46:42.878566545 +0000 UTC m=+670.890093320" observedRunningTime="2026-01-28 18:46:43.430577992 +0000 UTC m=+671.442104767" watchObservedRunningTime="2026-01-28 18:46:43.448973492 +0000 UTC m=+671.460500267" Jan 28 18:46:44 crc kubenswrapper[4749]: I0128 18:46:44.428190 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" event={"ID":"4f02f0f9-aa18-4986-af31-25b776f67fb7","Type":"ContainerStarted","Data":"d8f405ce261cc762ae613104058ae7401643f4354db25603bf40a590cdfc352d"} Jan 28 18:46:44 crc kubenswrapper[4749]: I0128 18:46:44.429195 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" event={"ID":"4f02f0f9-aa18-4986-af31-25b776f67fb7","Type":"ContainerStarted","Data":"9ea9ee520511d5f59087321c6b2dfbda7680ae73ec9f1a4ffd522d2992f73e8f"} Jan 28 18:46:44 crc kubenswrapper[4749]: I0128 18:46:44.452537 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b" podStartSLOduration=24.452513822 podStartE2EDuration="24.452513822s" podCreationTimestamp="2026-01-28 18:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:46:44.447389337 +0000 UTC m=+672.458916132" watchObservedRunningTime="2026-01-28 18:46:44.452513822 +0000 UTC m=+672.464040597" Jan 28 18:46:47 crc kubenswrapper[4749]: I0128 18:46:47.446421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" event={"ID":"2f8457f9-3010-4acc-88d1-97e5bec85c2c","Type":"ContainerStarted","Data":"30bd3637df4124e8525cd1a07feadb64c4be92472a489f0b06fe10b931ee47e8"} Jan 28 18:46:47 crc kubenswrapper[4749]: I0128 18:46:47.448612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" event={"ID":"45c3f22c-a523-4e94-858c-97bdb2705b9e","Type":"ContainerStarted","Data":"218da8d520cd5b95ad8cff76ca249e12b5c36e730ae780d97213187fd765c5dc"} Jan 28 18:46:47 crc kubenswrapper[4749]: I0128 18:46:47.448747 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:46:47 crc kubenswrapper[4749]: I0128 18:46:47.466596 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hmgb7" podStartSLOduration=25.152632275 podStartE2EDuration="28.466568417s" podCreationTimestamp="2026-01-28 18:46:19 +0000 UTC" firstStartedPulling="2026-01-28 18:46:43.050738092 +0000 UTC m=+671.062264867" lastFinishedPulling="2026-01-28 18:46:46.364674234 +0000 UTC m=+674.376201009" observedRunningTime="2026-01-28 18:46:47.460886908 +0000 UTC m=+675.472413703" watchObservedRunningTime="2026-01-28 18:46:47.466568417 +0000 UTC m=+675.478095202" Jan 28 18:46:47 crc kubenswrapper[4749]: I0128 18:46:47.479599 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" podStartSLOduration=23.561466261 podStartE2EDuration="27.479578915s" podCreationTimestamp="2026-01-28 18:46:20 +0000 UTC" firstStartedPulling="2026-01-28 18:46:42.433381847 +0000 UTC m=+670.444908622" lastFinishedPulling="2026-01-28 18:46:46.351494501 +0000 UTC m=+674.363021276" observedRunningTime="2026-01-28 18:46:47.475834493 +0000 UTC m=+675.487361278" watchObservedRunningTime="2026-01-28 18:46:47.479578915 +0000 UTC m=+675.491105690" Jan 28 18:46:49 crc kubenswrapper[4749]: I0128 18:46:49.536545 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6wgpf" Jan 28 18:46:52 crc kubenswrapper[4749]: I0128 18:46:52.499022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" event={"ID":"502b92eb-ac87-456c-933a-7e9ff562e326","Type":"ContainerStarted","Data":"67e09e7c7d9c6ccddf40f0f8da95eb2fb86de410b971eb4e61d49a305987c905"} Jan 28 18:46:52 crc kubenswrapper[4749]: I0128 18:46:52.499391 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:52 crc kubenswrapper[4749]: I0128 18:46:52.502079 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" Jan 28 18:46:52 crc kubenswrapper[4749]: I0128 18:46:52.522263 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-4kwb4" podStartSLOduration=23.926403558 podStartE2EDuration="32.522217865s" podCreationTimestamp="2026-01-28 18:46:20 +0000 UTC" firstStartedPulling="2026-01-28 18:46:43.36744005 +0000 UTC m=+671.378966835" lastFinishedPulling="2026-01-28 18:46:51.963254367 +0000 UTC m=+679.974781142" observedRunningTime="2026-01-28 18:46:52.512796695 +0000 UTC m=+680.524323480" watchObservedRunningTime="2026-01-28 18:46:52.522217865 +0000 UTC m=+680.533764551" Jan 28 18:46:57 crc kubenswrapper[4749]: I0128 18:46:57.467271 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:46:57 crc kubenswrapper[4749]: I0128 18:46:57.468066 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.322493 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh"] Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.324608 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.328158 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.328258 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8585q" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.328174 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.335528 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-h9mw4"] Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.336505 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-h9mw4" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.341038 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-9tptt" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.341581 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh"] Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.351594 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-h9mw4"] Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.380428 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-mhc52"] Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.381607 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.385981 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-96wnk" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.394691 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-mhc52"] Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.433625 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hflcx\" (UniqueName: \"kubernetes.io/projected/ffb56991-41fd-4ca3-9aca-5577ff399534-kube-api-access-hflcx\") pod \"cert-manager-858654f9db-h9mw4\" (UID: \"ffb56991-41fd-4ca3-9aca-5577ff399534\") " pod="cert-manager/cert-manager-858654f9db-h9mw4" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.433767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpstj\" (UniqueName: \"kubernetes.io/projected/3b3b2f85-7b19-4bc6-8618-885d114ed3d3-kube-api-access-cpstj\") pod \"cert-manager-cainjector-cf98fcc89-rdfhh\" (UID: \"3b3b2f85-7b19-4bc6-8618-885d114ed3d3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.534788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpstj\" (UniqueName: \"kubernetes.io/projected/3b3b2f85-7b19-4bc6-8618-885d114ed3d3-kube-api-access-cpstj\") pod \"cert-manager-cainjector-cf98fcc89-rdfhh\" (UID: \"3b3b2f85-7b19-4bc6-8618-885d114ed3d3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh" Jan 28 18:47:00 crc 
kubenswrapper[4749]: I0128 18:47:00.534871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hflcx\" (UniqueName: \"kubernetes.io/projected/ffb56991-41fd-4ca3-9aca-5577ff399534-kube-api-access-hflcx\") pod \"cert-manager-858654f9db-h9mw4\" (UID: \"ffb56991-41fd-4ca3-9aca-5577ff399534\") " pod="cert-manager/cert-manager-858654f9db-h9mw4" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.534953 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l79pp\" (UniqueName: \"kubernetes.io/projected/9cdb06ed-9c63-4f38-9276-42339904fdd0-kube-api-access-l79pp\") pod \"cert-manager-webhook-687f57d79b-mhc52\" (UID: \"9cdb06ed-9c63-4f38-9276-42339904fdd0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.557408 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hflcx\" (UniqueName: \"kubernetes.io/projected/ffb56991-41fd-4ca3-9aca-5577ff399534-kube-api-access-hflcx\") pod \"cert-manager-858654f9db-h9mw4\" (UID: \"ffb56991-41fd-4ca3-9aca-5577ff399534\") " pod="cert-manager/cert-manager-858654f9db-h9mw4" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.557994 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpstj\" (UniqueName: \"kubernetes.io/projected/3b3b2f85-7b19-4bc6-8618-885d114ed3d3-kube-api-access-cpstj\") pod \"cert-manager-cainjector-cf98fcc89-rdfhh\" (UID: \"3b3b2f85-7b19-4bc6-8618-885d114ed3d3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.599034 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.636098 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l79pp\" (UniqueName: \"kubernetes.io/projected/9cdb06ed-9c63-4f38-9276-42339904fdd0-kube-api-access-l79pp\") pod \"cert-manager-webhook-687f57d79b-mhc52\" (UID: \"9cdb06ed-9c63-4f38-9276-42339904fdd0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.663196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l79pp\" (UniqueName: \"kubernetes.io/projected/9cdb06ed-9c63-4f38-9276-42339904fdd0-kube-api-access-l79pp\") pod \"cert-manager-webhook-687f57d79b-mhc52\" (UID: \"9cdb06ed-9c63-4f38-9276-42339904fdd0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.664657 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.676866 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-h9mw4" Jan 28 18:47:00 crc kubenswrapper[4749]: I0128 18:47:00.698788 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" Jan 28 18:47:01 crc kubenswrapper[4749]: I0128 18:47:01.157764 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh"] Jan 28 18:47:01 crc kubenswrapper[4749]: I0128 18:47:01.214466 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-h9mw4"] Jan 28 18:47:01 crc kubenswrapper[4749]: I0128 18:47:01.220474 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-mhc52"] Jan 28 18:47:01 crc kubenswrapper[4749]: W0128 18:47:01.225164 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffb56991_41fd_4ca3_9aca_5577ff399534.slice/crio-e74feccc89eb68cd004949f7aa92c5317fe75f0a847ebea2dd8aefbcd016d694 WatchSource:0}: Error finding container e74feccc89eb68cd004949f7aa92c5317fe75f0a847ebea2dd8aefbcd016d694: Status 404 returned error can't find the container with id e74feccc89eb68cd004949f7aa92c5317fe75f0a847ebea2dd8aefbcd016d694 Jan 28 18:47:01 crc kubenswrapper[4749]: W0128 18:47:01.228750 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdb06ed_9c63_4f38_9276_42339904fdd0.slice/crio-999731df0ff5f571340be935cd45fcfb31e12da29c8830f1591a872108201bce WatchSource:0}: Error finding container 999731df0ff5f571340be935cd45fcfb31e12da29c8830f1591a872108201bce: Status 404 returned error can't find the container with id 999731df0ff5f571340be935cd45fcfb31e12da29c8830f1591a872108201bce Jan 28 18:47:01 crc kubenswrapper[4749]: I0128 18:47:01.552150 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-h9mw4" event={"ID":"ffb56991-41fd-4ca3-9aca-5577ff399534","Type":"ContainerStarted","Data":"e74feccc89eb68cd004949f7aa92c5317fe75f0a847ebea2dd8aefbcd016d694"} Jan 28 18:47:01 crc kubenswrapper[4749]: I0128 18:47:01.553816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh" event={"ID":"3b3b2f85-7b19-4bc6-8618-885d114ed3d3","Type":"ContainerStarted","Data":"aa136bbad6ecb45941cfb30e3013dc772711685ba3d00681521c3ea61ba11c19"} Jan 28 18:47:01 crc kubenswrapper[4749]: I0128 18:47:01.555631 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" event={"ID":"9cdb06ed-9c63-4f38-9276-42339904fdd0","Type":"ContainerStarted","Data":"999731df0ff5f571340be935cd45fcfb31e12da29c8830f1591a872108201bce"} Jan 28 18:47:07 crc kubenswrapper[4749]: I0128 18:47:07.620170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" event={"ID":"9cdb06ed-9c63-4f38-9276-42339904fdd0","Type":"ContainerStarted","Data":"d54cc939490903cc0930bc014ce1aee723c37d39d05fca82f0e4740a3ea31322"} Jan 28 18:47:07 crc kubenswrapper[4749]: I0128 18:47:07.620747 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" Jan 28 18:47:07 crc kubenswrapper[4749]: I0128 18:47:07.621701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-h9mw4" event={"ID":"ffb56991-41fd-4ca3-9aca-5577ff399534","Type":"ContainerStarted","Data":"766607d2ab38fa34efad71ae2ad3e9ce4e3cddcece3c309751d33dffc6c33022"} Jan 28 18:47:07 crc kubenswrapper[4749]: I0128 
18:47:07.623996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh" event={"ID":"3b3b2f85-7b19-4bc6-8618-885d114ed3d3","Type":"ContainerStarted","Data":"d0bff7e70453971082ef5a6109461563294263e9b46c341b7bea805acc1783b3"} Jan 28 18:47:07 crc kubenswrapper[4749]: I0128 18:47:07.635442 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" podStartSLOduration=1.923974574 podStartE2EDuration="7.635417466s" podCreationTimestamp="2026-01-28 18:47:00 +0000 UTC" firstStartedPulling="2026-01-28 18:47:01.230381687 +0000 UTC m=+689.241908462" lastFinishedPulling="2026-01-28 18:47:06.941824579 +0000 UTC m=+694.953351354" observedRunningTime="2026-01-28 18:47:07.635351485 +0000 UTC m=+695.646878280" watchObservedRunningTime="2026-01-28 18:47:07.635417466 +0000 UTC m=+695.646944251" Jan 28 18:47:07 crc kubenswrapper[4749]: I0128 18:47:07.652605 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-h9mw4" podStartSLOduration=1.9345355419999999 podStartE2EDuration="7.652584445s" podCreationTimestamp="2026-01-28 18:47:00 +0000 UTC" firstStartedPulling="2026-01-28 18:47:01.227963918 +0000 UTC m=+689.239490693" lastFinishedPulling="2026-01-28 18:47:06.946012821 +0000 UTC m=+694.957539596" observedRunningTime="2026-01-28 18:47:07.650798292 +0000 UTC m=+695.662325087" watchObservedRunningTime="2026-01-28 18:47:07.652584445 +0000 UTC m=+695.664111220" Jan 28 18:47:07 crc kubenswrapper[4749]: I0128 18:47:07.671370 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rdfhh" podStartSLOduration=1.905287757 podStartE2EDuration="7.671347803s" podCreationTimestamp="2026-01-28 18:47:00 +0000 UTC" firstStartedPulling="2026-01-28 18:47:01.167530891 +0000 UTC m=+689.179057666" lastFinishedPulling="2026-01-28 18:47:06.933590937 +0000 UTC m=+694.945117712" observedRunningTime="2026-01-28 18:47:07.667579192 +0000 UTC m=+695.679105977" watchObservedRunningTime="2026-01-28 18:47:07.671347803 +0000 UTC m=+695.682874578" Jan 28 18:47:15 crc kubenswrapper[4749]: I0128 18:47:15.701672 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" Jan 28 18:47:27 crc kubenswrapper[4749]: I0128 18:47:27.467815 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:47:27 crc kubenswrapper[4749]: I0128 18:47:27.469533 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.514416 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl"] Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.516634 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.519220 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.532201 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl"] Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.630183 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqlnk\" (UniqueName: \"kubernetes.io/projected/a6755cda-1468-4e58-a4b1-9b93b269ec5f-kube-api-access-fqlnk\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.630271 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.630426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.668924 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89"] Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.670128 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.681824 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89"] Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.731854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqlnk\" (UniqueName: \"kubernetes.io/projected/a6755cda-1468-4e58-a4b1-9b93b269ec5f-kube-api-access-fqlnk\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.731956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.731990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.732022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.732041 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.732061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-628pc\" (UniqueName: \"kubernetes.io/projected/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-kube-api-access-628pc\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.732488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " 
pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.732510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.760217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqlnk\" (UniqueName: \"kubernetes.io/projected/a6755cda-1468-4e58-a4b1-9b93b269ec5f-kube-api-access-fqlnk\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.833147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.833197 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.833225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-628pc\" (UniqueName: \"kubernetes.io/projected/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-kube-api-access-628pc\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.833752 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.833808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.837659 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:44 crc kubenswrapper[4749]: I0128 18:47:44.851600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-628pc\" (UniqueName: \"kubernetes.io/projected/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-kube-api-access-628pc\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.001370 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.022101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl"] Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.246966 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89"] Jan 28 18:47:45 crc kubenswrapper[4749]: W0128 18:47:45.252728 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb553de9c_6ef6_48a5_bb5b_c6dd0f12f18b.slice/crio-8dc52af63f012c63bc7c084d50d0995d9025143d8881737e9545b8b2ee4f14c1 WatchSource:0}: Error finding container 8dc52af63f012c63bc7c084d50d0995d9025143d8881737e9545b8b2ee4f14c1: Status 404 returned error can't find the container with id 8dc52af63f012c63bc7c084d50d0995d9025143d8881737e9545b8b2ee4f14c1 Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.866376 4749 generic.go:334] "Generic (PLEG): container finished" podID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerID="3f38b5eea77498c016a950816def0d349b46bf9be1be206da6abd86fcdd403a5" exitCode=0 Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.866489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" event={"ID":"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b","Type":"ContainerDied","Data":"3f38b5eea77498c016a950816def0d349b46bf9be1be206da6abd86fcdd403a5"} Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.866781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" event={"ID":"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b","Type":"ContainerStarted","Data":"8dc52af63f012c63bc7c084d50d0995d9025143d8881737e9545b8b2ee4f14c1"} Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.868736 4749 generic.go:334] "Generic (PLEG): container finished" podID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerID="f9a418e6fe7d8f0ff6046741e88cf0bb341921f6d70d65e9ed66c42130ccfbdb" exitCode=0 Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.868770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" event={"ID":"a6755cda-1468-4e58-a4b1-9b93b269ec5f","Type":"ContainerDied","Data":"f9a418e6fe7d8f0ff6046741e88cf0bb341921f6d70d65e9ed66c42130ccfbdb"} Jan 28 18:47:45 crc kubenswrapper[4749]: I0128 18:47:45.868788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" event={"ID":"a6755cda-1468-4e58-a4b1-9b93b269ec5f","Type":"ContainerStarted","Data":"f6eeaaec507aa93885aa512be9a4d6bf148bf0c0db443251c51bcd9ce37d439f"} Jan 28 18:47:47 crc kubenswrapper[4749]: I0128 18:47:47.887358 4749 generic.go:334] "Generic (PLEG): container finished" podID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerID="3fbe12481a773ae5e624eba9c1083fcece2ddcf5a6202f74a7d7bf65d637754e" exitCode=0 Jan 28 18:47:47 crc kubenswrapper[4749]: I0128 18:47:47.887373 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" event={"ID":"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b","Type":"ContainerDied","Data":"3fbe12481a773ae5e624eba9c1083fcece2ddcf5a6202f74a7d7bf65d637754e"} Jan 28 18:47:47 crc kubenswrapper[4749]: I0128 18:47:47.891783 4749 generic.go:334] "Generic (PLEG): container finished" podID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerID="40d19a149f5fd4f47620e9b289acbe771455d5a8b95c1503c60faa86fe614972" exitCode=0 Jan 28 18:47:47 crc kubenswrapper[4749]: I0128 18:47:47.891822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" event={"ID":"a6755cda-1468-4e58-a4b1-9b93b269ec5f","Type":"ContainerDied","Data":"40d19a149f5fd4f47620e9b289acbe771455d5a8b95c1503c60faa86fe614972"} Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.238996 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jqf4n"] Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.240930 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.261855 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jqf4n"] Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.383651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-catalog-content\") pod \"redhat-operators-jqf4n\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.383710 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6rk\" (UniqueName: \"kubernetes.io/projected/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-kube-api-access-cn6rk\") pod \"redhat-operators-jqf4n\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.383793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-utilities\") pod \"redhat-operators-jqf4n\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.485480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-catalog-content\") pod \"redhat-operators-jqf4n\" (UID: 
\"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.485529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6rk\" (UniqueName: \"kubernetes.io/projected/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-kube-api-access-cn6rk\") pod \"redhat-operators-jqf4n\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.485602 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-utilities\") pod \"redhat-operators-jqf4n\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.486056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-utilities\") pod \"redhat-operators-jqf4n\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.486267 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-catalog-content\") pod \"redhat-operators-jqf4n\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.507018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6rk\" (UniqueName: \"kubernetes.io/projected/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-kube-api-access-cn6rk\") pod \"redhat-operators-jqf4n\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.602754 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.834193 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jqf4n"] Jan 28 18:47:48 crc kubenswrapper[4749]: W0128 18:47:48.843054 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff28ac0_b07e_4a68_9bf7_89a9322fb7f4.slice/crio-375a02c8440deb613596fc0d1977872a3acff456ba1fca8226c922456771c8fe WatchSource:0}: Error finding container 375a02c8440deb613596fc0d1977872a3acff456ba1fca8226c922456771c8fe: Status 404 returned error can't find the container with id 375a02c8440deb613596fc0d1977872a3acff456ba1fca8226c922456771c8fe Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.900522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqf4n" event={"ID":"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4","Type":"ContainerStarted","Data":"375a02c8440deb613596fc0d1977872a3acff456ba1fca8226c922456771c8fe"} Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.907453 4749 generic.go:334] "Generic (PLEG): container finished" podID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerID="8a38a0ea6a61977086aab87cfb6b9bf033298b06e80d149ac48efd9f574e6bfa" exitCode=0 Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.907521 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" event={"ID":"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b","Type":"ContainerDied","Data":"8a38a0ea6a61977086aab87cfb6b9bf033298b06e80d149ac48efd9f574e6bfa"} Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.911041 4749 generic.go:334] "Generic (PLEG): container finished" podID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerID="d492eeea5686cabc8e3278ea9e219fb69266a2345582a9b2503bd24d95eaaa56" exitCode=0 Jan 28 18:47:48 crc kubenswrapper[4749]: I0128 18:47:48.911078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" event={"ID":"a6755cda-1468-4e58-a4b1-9b93b269ec5f","Type":"ContainerDied","Data":"d492eeea5686cabc8e3278ea9e219fb69266a2345582a9b2503bd24d95eaaa56"} Jan 28 18:47:49 crc kubenswrapper[4749]: I0128 18:47:49.918118 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerID="dbf2e4dd33c3fcc053b5b85dfa7c391c649a8992a1c26d2f485fdabb464599da" exitCode=0 Jan 28 18:47:49 crc kubenswrapper[4749]: I0128 18:47:49.918163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqf4n" event={"ID":"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4","Type":"ContainerDied","Data":"dbf2e4dd33c3fcc053b5b85dfa7c391c649a8992a1c26d2f485fdabb464599da"} Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.161370 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.207583 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.320165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-628pc\" (UniqueName: \"kubernetes.io/projected/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-kube-api-access-628pc\") pod \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.320253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-bundle\") pod \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.320284 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-bundle\") pod \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.320315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-util\") pod \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.320407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-util\") pod \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\" (UID: \"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b\") " Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.320457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqlnk\" (UniqueName: \"kubernetes.io/projected/a6755cda-1468-4e58-a4b1-9b93b269ec5f-kube-api-access-fqlnk\") pod \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\" (UID: \"a6755cda-1468-4e58-a4b1-9b93b269ec5f\") " Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.321565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-bundle" (OuterVolumeSpecName: "bundle") pod "b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" (UID: "b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.321594 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-bundle" (OuterVolumeSpecName: "bundle") pod "a6755cda-1468-4e58-a4b1-9b93b269ec5f" (UID: "a6755cda-1468-4e58-a4b1-9b93b269ec5f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.325554 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6755cda-1468-4e58-a4b1-9b93b269ec5f-kube-api-access-fqlnk" (OuterVolumeSpecName: "kube-api-access-fqlnk") pod "a6755cda-1468-4e58-a4b1-9b93b269ec5f" (UID: "a6755cda-1468-4e58-a4b1-9b93b269ec5f"). InnerVolumeSpecName "kube-api-access-fqlnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.325600 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-kube-api-access-628pc" (OuterVolumeSpecName: "kube-api-access-628pc") pod "b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" (UID: "b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b"). InnerVolumeSpecName "kube-api-access-628pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.336644 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-util" (OuterVolumeSpecName: "util") pod "b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" (UID: "b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.342310 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-util" (OuterVolumeSpecName: "util") pod "a6755cda-1468-4e58-a4b1-9b93b269ec5f" (UID: "a6755cda-1468-4e58-a4b1-9b93b269ec5f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.422735 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.422773 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.422783 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6755cda-1468-4e58-a4b1-9b93b269ec5f-util\") on node \"crc\" DevicePath \"\"" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.422794 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-util\") on node \"crc\" DevicePath \"\"" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.422805 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqlnk\" (UniqueName: \"kubernetes.io/projected/a6755cda-1468-4e58-a4b1-9b93b269ec5f-kube-api-access-fqlnk\") on node \"crc\" DevicePath \"\"" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.422817 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-628pc\" (UniqueName: \"kubernetes.io/projected/b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b-kube-api-access-628pc\") on node \"crc\" DevicePath \"\"" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.925059 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqf4n" event={"ID":"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4","Type":"ContainerStarted","Data":"367dc9b0809b56194a275ef626a4ff81f461a5dff9fb1d81075308769a14052f"} Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.927088 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.927083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89" event={"ID":"b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b","Type":"ContainerDied","Data":"8dc52af63f012c63bc7c084d50d0995d9025143d8881737e9545b8b2ee4f14c1"} Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.927252 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc52af63f012c63bc7c084d50d0995d9025143d8881737e9545b8b2ee4f14c1" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.928776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" event={"ID":"a6755cda-1468-4e58-a4b1-9b93b269ec5f","Type":"ContainerDied","Data":"f6eeaaec507aa93885aa512be9a4d6bf148bf0c0db443251c51bcd9ce37d439f"} Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.928801 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6eeaaec507aa93885aa512be9a4d6bf148bf0c0db443251c51bcd9ce37d439f" Jan 28 18:47:50 crc kubenswrapper[4749]: I0128 18:47:50.928826 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl" Jan 28 18:47:51 crc kubenswrapper[4749]: I0128 18:47:51.936560 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerID="367dc9b0809b56194a275ef626a4ff81f461a5dff9fb1d81075308769a14052f" exitCode=0 Jan 28 18:47:51 crc kubenswrapper[4749]: I0128 18:47:51.936676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqf4n" event={"ID":"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4","Type":"ContainerDied","Data":"367dc9b0809b56194a275ef626a4ff81f461a5dff9fb1d81075308769a14052f"} Jan 28 18:47:52 crc kubenswrapper[4749]: I0128 18:47:52.945620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqf4n" event={"ID":"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4","Type":"ContainerStarted","Data":"d9da5185cdf8c51881f03c7381d1fd266d0e73d8e263f2220a56d1a2806a6f39"} Jan 28 18:47:52 crc kubenswrapper[4749]: I0128 18:47:52.965891 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jqf4n" podStartSLOduration=2.227196933 podStartE2EDuration="4.965874673s" podCreationTimestamp="2026-01-28 18:47:48 +0000 UTC" firstStartedPulling="2026-01-28 18:47:49.920575826 +0000 UTC m=+737.932102601" lastFinishedPulling="2026-01-28 18:47:52.659253566 +0000 UTC m=+740.670780341" observedRunningTime="2026-01-28 18:47:52.961126989 +0000 UTC m=+740.972653764" watchObservedRunningTime="2026-01-28 18:47:52.965874673 +0000 UTC m=+740.977401448" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.479759 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g"] Jan 28 18:47:54 crc kubenswrapper[4749]: E0128 18:47:54.480019 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerName="extract" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480033 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerName="extract" Jan 28 18:47:54 crc kubenswrapper[4749]: E0128 18:47:54.480053 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerName="pull" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480059 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerName="pull" Jan 28 18:47:54 crc kubenswrapper[4749]: E0128 18:47:54.480072 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerName="util" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480079 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerName="util" Jan 28 18:47:54 crc kubenswrapper[4749]: E0128 18:47:54.480086 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerName="pull" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480092 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerName="pull" Jan 28 18:47:54 crc kubenswrapper[4749]: E0128 18:47:54.480101 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerName="extract" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480106 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerName="extract" Jan 28 18:47:54 crc kubenswrapper[4749]: E0128 18:47:54.480113 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerName="util" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480119 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerName="util" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480260 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b" containerName="extract" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480270 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6755cda-1468-4e58-a4b1-9b93b269ec5f" containerName="extract" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.480733 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.483435 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.483442 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.483713 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-6zmw9" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.492876 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g"] Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.580429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bndp9\" (UniqueName: \"kubernetes.io/projected/c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3-kube-api-access-bndp9\") pod \"cluster-logging-operator-79cf69ddc8-hf99g\" (UID: \"c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.682304 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bndp9\" (UniqueName: \"kubernetes.io/projected/c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3-kube-api-access-bndp9\") pod \"cluster-logging-operator-79cf69ddc8-hf99g\" (UID: \"c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.705906 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bndp9\" (UniqueName: \"kubernetes.io/projected/c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3-kube-api-access-bndp9\") pod \"cluster-logging-operator-79cf69ddc8-hf99g\" (UID: \"c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g" Jan 28 18:47:54 crc kubenswrapper[4749]: I0128 18:47:54.798399 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g" Jan 28 18:47:55 crc kubenswrapper[4749]: I0128 18:47:55.062831 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g"] Jan 28 18:47:55 crc kubenswrapper[4749]: I0128 18:47:55.976623 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g" event={"ID":"c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3","Type":"ContainerStarted","Data":"e8ede28b20431646a29d99ae7c2f7f7dd10f5d4f487294698b136d327cfef000"} Jan 28 18:47:57 crc kubenswrapper[4749]: I0128 18:47:57.467729 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:47:57 crc kubenswrapper[4749]: I0128 18:47:57.468161 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:47:57 crc kubenswrapper[4749]: I0128 18:47:57.468216 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:47:57 crc kubenswrapper[4749]: I0128 18:47:57.469404 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04f2d6bfc0ed8ee6f0293f2bd9184234285368c2476cf72bb7d6b248a665883d"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 18:47:57 crc kubenswrapper[4749]: I0128 18:47:57.469484 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://04f2d6bfc0ed8ee6f0293f2bd9184234285368c2476cf72bb7d6b248a665883d" gracePeriod=600 Jan 28 18:47:58 crc kubenswrapper[4749]: I0128 18:47:58.604729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:58 crc kubenswrapper[4749]: I0128 18:47:58.605069 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:58 crc kubenswrapper[4749]: I0128 18:47:58.657648 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:47:59 crc kubenswrapper[4749]: I0128 18:47:59.045404 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:48:00 crc kubenswrapper[4749]: I0128 18:48:00.016972 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="04f2d6bfc0ed8ee6f0293f2bd9184234285368c2476cf72bb7d6b248a665883d" exitCode=0 Jan 28 18:48:00 crc kubenswrapper[4749]: I0128 18:48:00.017435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"04f2d6bfc0ed8ee6f0293f2bd9184234285368c2476cf72bb7d6b248a665883d"} Jan 28 18:48:00 crc kubenswrapper[4749]: I0128 18:48:00.017499 4749 scope.go:117] "RemoveContainer" containerID="1b3f4ad866e9df1f15443a8908d5b0434af59343a634d3e21933697773654ca4" Jan 28 18:48:01 crc kubenswrapper[4749]: I0128 18:48:01.830947 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jqf4n"] Jan 28 18:48:01 crc kubenswrapper[4749]: I0128 18:48:01.831237 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jqf4n" podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerName="registry-server" containerID="cri-o://d9da5185cdf8c51881f03c7381d1fd266d0e73d8e263f2220a56d1a2806a6f39" gracePeriod=2 Jan 28 18:48:02 crc kubenswrapper[4749]: I0128 18:48:02.040872 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerID="d9da5185cdf8c51881f03c7381d1fd266d0e73d8e263f2220a56d1a2806a6f39" exitCode=0 Jan 28 18:48:02 crc kubenswrapper[4749]: I0128 18:48:02.040925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqf4n" event={"ID":"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4","Type":"ContainerDied","Data":"d9da5185cdf8c51881f03c7381d1fd266d0e73d8e263f2220a56d1a2806a6f39"} Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.749162 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.830980 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn6rk\" (UniqueName: \"kubernetes.io/projected/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-kube-api-access-cn6rk\") pod \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.831035 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-catalog-content\") pod \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.831114 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-utilities\") pod \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\" (UID: \"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4\") " Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.832216 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-utilities" (OuterVolumeSpecName: "utilities") pod "7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" (UID: "7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.836250 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-kube-api-access-cn6rk" (OuterVolumeSpecName: "kube-api-access-cn6rk") pod "7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" (UID: "7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4"). InnerVolumeSpecName "kube-api-access-cn6rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.932319 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn6rk\" (UniqueName: \"kubernetes.io/projected/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-kube-api-access-cn6rk\") on node \"crc\" DevicePath \"\"" Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.932376 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:48:03 crc kubenswrapper[4749]: I0128 18:48:03.950781 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" (UID: "7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.033216 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.055971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"4e5b313040fcc5d2f4a0e0713dba32c28c08f86a27e8cecbbc5d364a34a7eb3e"} Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.059459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jqf4n" event={"ID":"7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4","Type":"ContainerDied","Data":"375a02c8440deb613596fc0d1977872a3acff456ba1fca8226c922456771c8fe"} Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.059515 4749 scope.go:117] "RemoveContainer" containerID="d9da5185cdf8c51881f03c7381d1fd266d0e73d8e263f2220a56d1a2806a6f39" Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.059538 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jqf4n" Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.090488 4749 scope.go:117] "RemoveContainer" containerID="367dc9b0809b56194a275ef626a4ff81f461a5dff9fb1d81075308769a14052f" Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.096296 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jqf4n"] Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.102629 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jqf4n"] Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.121408 4749 scope.go:117] "RemoveContainer" containerID="dbf2e4dd33c3fcc053b5b85dfa7c391c649a8992a1c26d2f485fdabb464599da" Jan 28 18:48:04 crc kubenswrapper[4749]: I0128 18:48:04.879296 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" path="/var/lib/kubelet/pods/7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4/volumes" Jan 28 18:48:05 crc kubenswrapper[4749]: I0128 18:48:05.067755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g" event={"ID":"c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3","Type":"ContainerStarted","Data":"3cb3749c057c1c52b794b5ee2fbe8d0c9dbc4707fc814ce0a9673a0c920c41cc"} Jan 28 18:48:05 crc kubenswrapper[4749]: I0128 18:48:05.085253 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-hf99g" podStartSLOduration=2.579631238 podStartE2EDuration="11.085227963s" podCreationTimestamp="2026-01-28 18:47:54 +0000 UTC" firstStartedPulling="2026-01-28 18:47:55.077913848 +0000 UTC m=+743.089440623" lastFinishedPulling="2026-01-28 18:48:03.583510573 +0000 UTC m=+751.595037348" observedRunningTime="2026-01-28 18:48:05.082105748 +0000 UTC m=+753.093632533" watchObservedRunningTime="2026-01-28 18:48:05.085227963 +0000 UTC m=+753.096754738" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.281290 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw"] Jan 28 18:48:06 crc kubenswrapper[4749]: E0128 18:48:06.281640 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerName="extract-content" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.281659 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerName="extract-content" Jan 28 18:48:06 crc kubenswrapper[4749]: E0128 18:48:06.281685 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerName="extract-utilities" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.281693 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerName="extract-utilities" Jan 28 18:48:06 crc kubenswrapper[4749]: E0128 18:48:06.281708 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerName="registry-server" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.281716 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerName="registry-server" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.281886 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7ff28ac0-b07e-4a68-9bf7-89a9322fb7f4" containerName="registry-server" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.282618 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.284245 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.285011 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.285077 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.285198 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-w5d42" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.285495 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.296776 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.297093 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw"] Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.368022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-webhook-cert\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.368121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5ee54d7a-a2c0-4238-be42-ba0843c776ef-manager-config\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.368159 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.368231 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4c9\" (UniqueName: \"kubernetes.io/projected/5ee54d7a-a2c0-4238-be42-ba0843c776ef-kube-api-access-gm4c9\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.368275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-apiservice-cert\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.469709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5ee54d7a-a2c0-4238-be42-ba0843c776ef-manager-config\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.470160 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.470234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4c9\" (UniqueName: \"kubernetes.io/projected/5ee54d7a-a2c0-4238-be42-ba0843c776ef-kube-api-access-gm4c9\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.470310 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-apiservice-cert\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.470379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-webhook-cert\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.471417 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5ee54d7a-a2c0-4238-be42-ba0843c776ef-manager-config\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.477978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-loki-operator-metrics-cert\") pod 
\"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.478480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-apiservice-cert\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.480025 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ee54d7a-a2c0-4238-be42-ba0843c776ef-webhook-cert\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.502089 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4c9\" (UniqueName: \"kubernetes.io/projected/5ee54d7a-a2c0-4238-be42-ba0843c776ef-kube-api-access-gm4c9\") pod \"loki-operator-controller-manager-76ff55d55d-mdgpw\" (UID: \"5ee54d7a-a2c0-4238-be42-ba0843c776ef\") " pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:06 crc kubenswrapper[4749]: I0128 18:48:06.597447 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:07 crc kubenswrapper[4749]: I0128 18:48:07.181237 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw"] Jan 28 18:48:08 crc kubenswrapper[4749]: I0128 18:48:08.088786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" event={"ID":"5ee54d7a-a2c0-4238-be42-ba0843c776ef","Type":"ContainerStarted","Data":"9e3ea99ee11fb9778156805556e8be96786080cf66ccdfeee7ba9be50cba8be2"} Jan 28 18:48:10 crc kubenswrapper[4749]: I0128 18:48:10.116229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" event={"ID":"5ee54d7a-a2c0-4238-be42-ba0843c776ef","Type":"ContainerStarted","Data":"a237098fec60e8dc4584c138e5b604540497801631327cd69211140cf26b21ab"} Jan 28 18:48:16 crc kubenswrapper[4749]: I0128 18:48:16.182638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" event={"ID":"5ee54d7a-a2c0-4238-be42-ba0843c776ef","Type":"ContainerStarted","Data":"057421625afa02f9798ef6931147193f8d13a87c5b25f79e565b1288c949eabb"} Jan 28 18:48:16 crc kubenswrapper[4749]: I0128 18:48:16.184371 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:16 crc kubenswrapper[4749]: I0128 18:48:16.186022 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" Jan 28 18:48:16 crc kubenswrapper[4749]: I0128 18:48:16.218137 4749 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-76ff55d55d-mdgpw" podStartSLOduration=1.688091088 podStartE2EDuration="10.218116462s" podCreationTimestamp="2026-01-28 18:48:06 +0000 UTC" firstStartedPulling="2026-01-28 18:48:07.174628003 +0000 UTC m=+755.186154778" lastFinishedPulling="2026-01-28 18:48:15.704653377 +0000 UTC m=+763.716180152" observedRunningTime="2026-01-28 18:48:16.211483404 +0000 UTC m=+764.223010199" watchObservedRunningTime="2026-01-28 18:48:16.218116462 +0000 UTC m=+764.229643227" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.669124 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.670629 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.673246 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.673422 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.679818 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.691913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-604f688c-523e-48a6-a3f6-c066d8c1e32a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-604f688c-523e-48a6-a3f6-c066d8c1e32a\") pod \"minio\" (UID: \"02a91f4e-38f3-4330-8e86-b9ac4ce1398a\") " pod="minio-dev/minio" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.691962 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqcp2\" (UniqueName: \"kubernetes.io/projected/02a91f4e-38f3-4330-8e86-b9ac4ce1398a-kube-api-access-bqcp2\") pod \"minio\" (UID: \"02a91f4e-38f3-4330-8e86-b9ac4ce1398a\") " pod="minio-dev/minio" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.793161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-604f688c-523e-48a6-a3f6-c066d8c1e32a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-604f688c-523e-48a6-a3f6-c066d8c1e32a\") pod \"minio\" (UID: \"02a91f4e-38f3-4330-8e86-b9ac4ce1398a\") " pod="minio-dev/minio" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.793529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqcp2\" (UniqueName: \"kubernetes.io/projected/02a91f4e-38f3-4330-8e86-b9ac4ce1398a-kube-api-access-bqcp2\") pod \"minio\" (UID: \"02a91f4e-38f3-4330-8e86-b9ac4ce1398a\") " pod="minio-dev/minio" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.813742 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.814021 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-604f688c-523e-48a6-a3f6-c066d8c1e32a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-604f688c-523e-48a6-a3f6-c066d8c1e32a\") pod \"minio\" (UID: \"02a91f4e-38f3-4330-8e86-b9ac4ce1398a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e61df84872be01f8117bfe093c1beea3b6dc412bc11ffad45b4d75b0f59da1f/globalmount\"" pod="minio-dev/minio" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.826994 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqcp2\" (UniqueName: \"kubernetes.io/projected/02a91f4e-38f3-4330-8e86-b9ac4ce1398a-kube-api-access-bqcp2\") pod \"minio\" (UID: \"02a91f4e-38f3-4330-8e86-b9ac4ce1398a\") " pod="minio-dev/minio" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.880780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-604f688c-523e-48a6-a3f6-c066d8c1e32a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-604f688c-523e-48a6-a3f6-c066d8c1e32a\") pod \"minio\" (UID: \"02a91f4e-38f3-4330-8e86-b9ac4ce1398a\") " pod="minio-dev/minio" Jan 28 18:48:20 crc kubenswrapper[4749]: I0128 18:48:20.996484 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 28 18:48:21 crc kubenswrapper[4749]: I0128 18:48:21.421908 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 28 18:48:22 crc kubenswrapper[4749]: I0128 18:48:22.267735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"02a91f4e-38f3-4330-8e86-b9ac4ce1398a","Type":"ContainerStarted","Data":"ba98451dc98c54e04b944a44cb17ed7300fbd5ea814b5df498f174effb57f679"} Jan 28 18:48:25 crc kubenswrapper[4749]: I0128 18:48:25.289937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"02a91f4e-38f3-4330-8e86-b9ac4ce1398a","Type":"ContainerStarted","Data":"897f877588fd4110043dec9dbbc2d9b7bb411ce7ea4228450064babf2d430ab4"} Jan 28 18:48:25 crc kubenswrapper[4749]: I0128 18:48:25.307384 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.0599641890000004 podStartE2EDuration="7.307363361s" podCreationTimestamp="2026-01-28 18:48:18 +0000 UTC" firstStartedPulling="2026-01-28 18:48:21.430083566 +0000 UTC m=+769.441610341" lastFinishedPulling="2026-01-28 18:48:24.677482738 +0000 UTC m=+772.689009513" observedRunningTime="2026-01-28 18:48:25.300831874 +0000 UTC m=+773.312358669" watchObservedRunningTime="2026-01-28 18:48:25.307363361 +0000 UTC m=+773.318890136" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.636719 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc"] Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.637983 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.640233 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.640463 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-lvcjn" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.641263 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.642940 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.649343 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc"] Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.651791 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.742449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.742512 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9px75\" (UniqueName: \"kubernetes.io/projected/eb71e057-ee31-4787-a3e4-d58815f9923e-kube-api-access-9px75\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.742531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.742554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.742578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb71e057-ee31-4787-a3e4-d58815f9923e-config\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.789114 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-logging/logging-loki-querier-76788598db-qphbq"] Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.789968 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.792748 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.793350 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.793552 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.812250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-qphbq"] Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.843850 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84872132-539a-41fc-9345-06b105caae61-config\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.843930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r7dx\" (UniqueName: \"kubernetes.io/projected/84872132-539a-41fc-9345-06b105caae61-kube-api-access-4r7dx\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-s3\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9px75\" (UniqueName: \"kubernetes.io/projected/eb71e057-ee31-4787-a3e4-d58815f9923e-kube-api-access-9px75\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " 
pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84872132-539a-41fc-9345-06b105caae61-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb71e057-ee31-4787-a3e4-d58815f9923e-config\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.844628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.845287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb71e057-ee31-4787-a3e4-d58815f9923e-config\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.845314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.850321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 
28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.851220 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/eb71e057-ee31-4787-a3e4-d58815f9923e-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.872013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9px75\" (UniqueName: \"kubernetes.io/projected/eb71e057-ee31-4787-a3e4-d58815f9923e-kube-api-access-9px75\") pod \"logging-loki-distributor-5f678c8dd6-qgkbc\" (UID: \"eb71e057-ee31-4787-a3e4-d58815f9923e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.907520 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd"] Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.908661 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.911398 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.911499 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.926879 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd"] Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946070 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745b517a-16b6-4319-a44d-8976b8659a23-config\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946181 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: 
\"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946303 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946386 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84872132-539a-41fc-9345-06b105caae61-config\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r7dx\" (UniqueName: \"kubernetes.io/projected/84872132-539a-41fc-9345-06b105caae61-kube-api-access-4r7dx\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-s3\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmdmp\" (UniqueName: \"kubernetes.io/projected/745b517a-16b6-4319-a44d-8976b8659a23-kube-api-access-zmdmp\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.946633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84872132-539a-41fc-9345-06b105caae61-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.949572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84872132-539a-41fc-9345-06b105caae61-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " 
pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.950339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84872132-539a-41fc-9345-06b105caae61-config\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.960898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.961672 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-s3\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.964992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/84872132-539a-41fc-9345-06b105caae61-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.970721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r7dx\" (UniqueName: \"kubernetes.io/projected/84872132-539a-41fc-9345-06b105caae61-kube-api-access-4r7dx\") pod \"logging-loki-querier-76788598db-qphbq\" (UID: \"84872132-539a-41fc-9345-06b105caae61\") " pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:30 crc kubenswrapper[4749]: I0128 18:48:30.985754 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.045973 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.047800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmdmp\" (UniqueName: \"kubernetes.io/projected/745b517a-16b6-4319-a44d-8976b8659a23-kube-api-access-zmdmp\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.047870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.047908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745b517a-16b6-4319-a44d-8976b8659a23-config\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.047954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.047981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.049008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.049821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745b517a-16b6-4319-a44d-8976b8659a23-config\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.058021 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.059161 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.066180 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.066468 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.066711 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.066820 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.069802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/745b517a-16b6-4319-a44d-8976b8659a23-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.081478 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.105912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdmp\" (UniqueName: \"kubernetes.io/projected/745b517a-16b6-4319-a44d-8976b8659a23-kube-api-access-zmdmp\") pod \"logging-loki-query-frontend-69d9546745-cfwmd\" (UID: \"745b517a-16b6-4319-a44d-8976b8659a23\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.108909 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.109549 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.110020 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.113523 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-46mmg" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.121354 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.140516 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149410 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149540 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tls-secret\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-rbac\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-rbac\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m75l6\" (UniqueName: 
\"kubernetes.io/projected/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-kube-api-access-m75l6\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tenants\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149744 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-lokistack-gateway\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149770 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tls-secret\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d99nw\" (UniqueName: \"kubernetes.io/projected/cdcc709f-ccfa-4927-a1a3-333e7f810817-kube-api-access-d99nw\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: 
\"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tenants\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.149905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-lokistack-gateway\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.251508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-rbac\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-rbac\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m75l6\" (UniqueName: \"kubernetes.io/projected/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-kube-api-access-m75l6\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tenants\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252140 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-lokistack-gateway\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " 
pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tls-secret\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d99nw\" (UniqueName: \"kubernetes.io/projected/cdcc709f-ccfa-4927-a1a3-333e7f810817-kube-api-access-d99nw\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252264 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tenants\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-lokistack-gateway\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252458 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tls-secret\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: E0128 18:48:31.252607 4749 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Jan 28 18:48:31 crc kubenswrapper[4749]: E0128 18:48:31.252672 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tls-secret podName:ca507b0e-375b-47b0-bd5e-c77f2bc7d521 nodeName:}" failed. No retries permitted until 2026-01-28 18:48:31.752647883 +0000 UTC m=+779.764174658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tls-secret") pod "logging-loki-gateway-5b6db5567f-7qbgw" (UID: "ca507b0e-375b-47b0-bd5e-c77f2bc7d521") : secret "logging-loki-gateway-http" not found Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.252787 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-rbac\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: E0128 18:48:31.253962 4749 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Jan 28 18:48:31 crc kubenswrapper[4749]: E0128 18:48:31.254008 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tls-secret podName:cdcc709f-ccfa-4927-a1a3-333e7f810817 nodeName:}" failed. No retries permitted until 2026-01-28 18:48:31.753995835 +0000 UTC m=+779.765522700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tls-secret") pod "logging-loki-gateway-5b6db5567f-9vjtq" (UID: "cdcc709f-ccfa-4927-a1a3-333e7f810817") : secret "logging-loki-gateway-http" not found Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.254060 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.254697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-lokistack-gateway\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.254699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.254860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.255740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-rbac\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.256303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-lokistack-gateway\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.257589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tenants\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.257695 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.257842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.264359 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.265457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.277362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d99nw\" (UniqueName: \"kubernetes.io/projected/cdcc709f-ccfa-4927-a1a3-333e7f810817-kube-api-access-d99nw\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.279589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m75l6\" (UniqueName: \"kubernetes.io/projected/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-kube-api-access-m75l6\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.280530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tenants\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.556501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.567369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.661096 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-qphbq"] Jan 28 18:48:31 crc kubenswrapper[4749]: W0128 18:48:31.665887 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84872132_539a_41fc_9345_06b105caae61.slice/crio-4ab2481984e327b922f28ad206c753d8a63e8cb9b5db24bcea0a300e5bee69f0 WatchSource:0}: Error finding container 4ab2481984e327b922f28ad206c753d8a63e8cb9b5db24bcea0a300e5bee69f0: Status 404 returned error can't find the container with id 4ab2481984e327b922f28ad206c753d8a63e8cb9b5db24bcea0a300e5bee69f0 Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.761772 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tls-secret\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.761908 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tls-secret\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.767236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ca507b0e-375b-47b0-bd5e-c77f2bc7d521-tls-secret\") pod \"logging-loki-gateway-5b6db5567f-7qbgw\" (UID: \"ca507b0e-375b-47b0-bd5e-c77f2bc7d521\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.767913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/cdcc709f-ccfa-4927-a1a3-333e7f810817-tls-secret\") pod \"logging-loki-gateway-5b6db5567f-9vjtq\" (UID: \"cdcc709f-ccfa-4927-a1a3-333e7f810817\") " pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.798366 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.799228 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.801356 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.801687 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.813268 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.864406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5a523318-6d84-4988-8bc9-e2f85b97fe5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a523318-6d84-4988-8bc9-e2f85b97fe5a\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.878998 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.880367 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.883445 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.883708 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.890804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.966280 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h76f9\" (UniqueName: \"kubernetes.io/projected/74be48ce-557a-4f08-931e-5f222458fbe3-kube-api-access-h76f9\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.966341 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74be48ce-557a-4f08-931e-5f222458fbe3-config\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.966360 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.966503 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.966670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.966718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.966796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-df634edb-e417-49d1-ba60-3accf87fae54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df634edb-e417-49d1-ba60-3accf87fae54\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 
18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.966859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5a523318-6d84-4988-8bc9-e2f85b97fe5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a523318-6d84-4988-8bc9-e2f85b97fe5a\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.970295 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.970361 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5a523318-6d84-4988-8bc9-e2f85b97fe5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a523318-6d84-4988-8bc9-e2f85b97fe5a\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da1aaaabba0b2e05a739b794579331655278f240a6b8acd08a98365856551a6f/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.985280 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.986251 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.990522 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.991306 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.992682 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:31 crc kubenswrapper[4749]: I0128 18:48:31.998723 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.027551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5a523318-6d84-4988-8bc9-e2f85b97fe5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a523318-6d84-4988-8bc9-e2f85b97fe5a\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.042978 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h76f9\" (UniqueName: \"kubernetes.io/projected/74be48ce-557a-4f08-931e-5f222458fbe3-kube-api-access-h76f9\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74be48ce-557a-4f08-931e-5f222458fbe3-config\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39f97e52-cfdc-487c-b88a-a315c4c4d651-config\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072613 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-122a4947-6872-48b9-9105-f19577343ecb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-122a4947-6872-48b9-9105-f19577343ecb\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072752 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxsxx\" (UniqueName: \"kubernetes.io/projected/39f97e52-cfdc-487c-b88a-a315c4c4d651-kube-api-access-xxsxx\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.072790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-df634edb-e417-49d1-ba60-3accf87fae54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df634edb-e417-49d1-ba60-3accf87fae54\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.074588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74be48ce-557a-4f08-931e-5f222458fbe3-config\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.075270 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.076917 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.076939 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-df634edb-e417-49d1-ba60-3accf87fae54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df634edb-e417-49d1-ba60-3accf87fae54\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/385613cb199c3e433aa8da05245cf5142e0c03b0af1f3e1f7a75c9cfa5e59772/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.084510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.085478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.087289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/74be48ce-557a-4f08-931e-5f222458fbe3-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.095504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h76f9\" (UniqueName: \"kubernetes.io/projected/74be48ce-557a-4f08-931e-5f222458fbe3-kube-api-access-h76f9\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.113504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-df634edb-e417-49d1-ba60-3accf87fae54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df634edb-e417-49d1-ba60-3accf87fae54\") pod \"logging-loki-ingester-0\" (UID: \"74be48ce-557a-4f08-931e-5f222458fbe3\") " pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.120796 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175271 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8811341b-2713-4cf1-8b60-e9a8bdc02dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8811341b-2713-4cf1-8b60-e9a8bdc02dae\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175395 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175442 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39f97e52-cfdc-487c-b88a-a315c4c4d651-config\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175545 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-122a4947-6872-48b9-9105-f19577343ecb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-122a4947-6872-48b9-9105-f19577343ecb\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzsf\" (UniqueName: \"kubernetes.io/projected/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-kube-api-access-vpzsf\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-config\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxsxx\" (UniqueName: \"kubernetes.io/projected/39f97e52-cfdc-487c-b88a-a315c4c4d651-kube-api-access-xxsxx\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.175660 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.177129 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.180029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39f97e52-cfdc-487c-b88a-a315c4c4d651-config\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.190022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 
crc kubenswrapper[4749]: I0128 18:48:32.193366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.193506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/39f97e52-cfdc-487c-b88a-a315c4c4d651-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.204106 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.204155 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-122a4947-6872-48b9-9105-f19577343ecb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-122a4947-6872-48b9-9105-f19577343ecb\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/eae25c355fa7b53338413b7b839cd45f0fcf182a9eb8d7d926718a261cac5449/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.211191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxsxx\" (UniqueName: \"kubernetes.io/projected/39f97e52-cfdc-487c-b88a-a315c4c4d651-kube-api-access-xxsxx\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.284136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.284212 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8811341b-2713-4cf1-8b60-e9a8bdc02dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8811341b-2713-4cf1-8b60-e9a8bdc02dae\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.284238 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.284300 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-index-gateway-grpc\") pod 
\"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.284346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzsf\" (UniqueName: \"kubernetes.io/projected/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-kube-api-access-vpzsf\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.284366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-config\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.284401 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.291378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.291993 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.296765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-config\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.299320 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.299422 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8811341b-2713-4cf1-8b60-e9a8bdc02dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8811341b-2713-4cf1-8b60-e9a8bdc02dae\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e49e6ceb2b97bce3bac5ae98efa23c53a3b875998fc8611cda295b28b4d6b4a5/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.300888 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.301988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.322716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzsf\" (UniqueName: \"kubernetes.io/projected/cb1f1961-f7f5-4c47-93f9-7f06ac02b45c-kube-api-access-vpzsf\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.342807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8811341b-2713-4cf1-8b60-e9a8bdc02dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8811341b-2713-4cf1-8b60-e9a8bdc02dae\") pod \"logging-loki-index-gateway-0\" (UID: \"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.353922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-122a4947-6872-48b9-9105-f19577343ecb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-122a4947-6872-48b9-9105-f19577343ecb\") pod \"logging-loki-compactor-0\" (UID: \"39f97e52-cfdc-487c-b88a-a315c4c4d651\") " pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.355536 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" event={"ID":"745b517a-16b6-4319-a44d-8976b8659a23","Type":"ContainerStarted","Data":"1001d4c773bf1364e1b6ffa1d7199c450d163e53e627598674ef74a4f3e4e6a9"} Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.356919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-qphbq" event={"ID":"84872132-539a-41fc-9345-06b105caae61","Type":"ContainerStarted","Data":"4ab2481984e327b922f28ad206c753d8a63e8cb9b5db24bcea0a300e5bee69f0"} Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.358103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" event={"ID":"eb71e057-ee31-4787-a3e4-d58815f9923e","Type":"ContainerStarted","Data":"b5e05ccff86724cde4360467b0e693b9ca27812961e1fd6e170d707ca9585890"} Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.415922 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.503838 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq"] Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.532143 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.669027 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw"] Jan 28 18:48:32 crc kubenswrapper[4749]: W0128 18:48:32.674788 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca507b0e_375b_47b0_bd5e_c77f2bc7d521.slice/crio-4f16a5803c2c6933d561b5da18137433f6da4f7a4b257130205b33a3b9e83285 WatchSource:0}: Error finding container 4f16a5803c2c6933d561b5da18137433f6da4f7a4b257130205b33a3b9e83285: Status 404 returned error can't find the container with id 4f16a5803c2c6933d561b5da18137433f6da4f7a4b257130205b33a3b9e83285 Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.725343 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 28 18:48:32 crc kubenswrapper[4749]: W0128 18:48:32.728049 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74be48ce_557a_4f08_931e_5f222458fbe3.slice/crio-c7fa02381cc6df60ceb0226b067833613f24be789e93a67f422e6fa0ff890a9e WatchSource:0}: Error finding container c7fa02381cc6df60ceb0226b067833613f24be789e93a67f422e6fa0ff890a9e: Status 404 returned error can't find the container with id c7fa02381cc6df60ceb0226b067833613f24be789e93a67f422e6fa0ff890a9e Jan 28 18:48:32 crc kubenswrapper[4749]: I0128 18:48:32.835694 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 28 18:48:33 crc kubenswrapper[4749]: I0128 18:48:33.023790 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 28 18:48:33 crc kubenswrapper[4749]: W0128 18:48:33.033684 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39f97e52_cfdc_487c_b88a_a315c4c4d651.slice/crio-9a976cd49e76a68229049ba9b5c0c9df464b3eb38ab3dde0723c6c632a6c1753 WatchSource:0}: Error finding container 9a976cd49e76a68229049ba9b5c0c9df464b3eb38ab3dde0723c6c632a6c1753: Status 404 returned error can't find the container with id 9a976cd49e76a68229049ba9b5c0c9df464b3eb38ab3dde0723c6c632a6c1753 Jan 28 18:48:33 crc kubenswrapper[4749]: I0128 18:48:33.382154 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c","Type":"ContainerStarted","Data":"7502673f3c3583f935f92c159b5b2b3471fb33bc06c7716576caf2933511692c"} Jan 28 18:48:33 crc kubenswrapper[4749]: I0128 18:48:33.384922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" event={"ID":"cdcc709f-ccfa-4927-a1a3-333e7f810817","Type":"ContainerStarted","Data":"f5bb1246b7b6ab386ea0d71daf1bf92e641d6aa3deae6d5ab28aff8b31aea081"} Jan 28 18:48:33 crc kubenswrapper[4749]: I0128 18:48:33.388716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"39f97e52-cfdc-487c-b88a-a315c4c4d651","Type":"ContainerStarted","Data":"9a976cd49e76a68229049ba9b5c0c9df464b3eb38ab3dde0723c6c632a6c1753"} Jan 28 18:48:33 crc kubenswrapper[4749]: I0128 18:48:33.390366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"74be48ce-557a-4f08-931e-5f222458fbe3","Type":"ContainerStarted","Data":"c7fa02381cc6df60ceb0226b067833613f24be789e93a67f422e6fa0ff890a9e"} Jan 28 18:48:33 crc kubenswrapper[4749]: I0128 18:48:33.392238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" event={"ID":"ca507b0e-375b-47b0-bd5e-c77f2bc7d521","Type":"ContainerStarted","Data":"4f16a5803c2c6933d561b5da18137433f6da4f7a4b257130205b33a3b9e83285"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.426214 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"cb1f1961-f7f5-4c47-93f9-7f06ac02b45c","Type":"ContainerStarted","Data":"e810760d94d3799d4b12f9ce9190285751bbaec1f032293107e2699fb0733ec8"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.426806 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.428789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" event={"ID":"cdcc709f-ccfa-4927-a1a3-333e7f810817","Type":"ContainerStarted","Data":"c61f0c47a4b6a9927099dfd67f4e4623330a9a30ab383a184b40e90652690cbe"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.430079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"39f97e52-cfdc-487c-b88a-a315c4c4d651","Type":"ContainerStarted","Data":"8ba0ec5a69156c73b02c0bab4c6c0e11a13929bbab5754c5bd4f607ddb59b919"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.430216 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.431752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"74be48ce-557a-4f08-931e-5f222458fbe3","Type":"ContainerStarted","Data":"60135f3010f4a82e75ae495ae2e06dafb399195dfdf7cfa0e03cbedf7b5672c0"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.431865 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.433683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" event={"ID":"eb71e057-ee31-4787-a3e4-d58815f9923e","Type":"ContainerStarted","Data":"f5a0e9ccede3ad03a411dd5131147363321668c6cab31987241aeaff910dc1a5"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.433755 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" 
Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.434909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" event={"ID":"ca507b0e-375b-47b0-bd5e-c77f2bc7d521","Type":"ContainerStarted","Data":"e4b2139f16900b2f21e5dfe7e6bc9d330900d16a2fbe1cd89d6605c6da863592"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.440143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" event={"ID":"745b517a-16b6-4319-a44d-8976b8659a23","Type":"ContainerStarted","Data":"2487ada662b78f13ea5ae65d3b5ce3ce526c129e6ada593ee6651931124f2733"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.440225 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.454108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-qphbq" event={"ID":"84872132-539a-41fc-9345-06b105caae61","Type":"ContainerStarted","Data":"6e22c902436a6e93a61251ec8a4aa75c379852b3fc4cd85784b1f62f4f9d5016"} Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.454271 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.480302 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.921862962 podStartE2EDuration="7.48027286s" podCreationTimestamp="2026-01-28 18:48:30 +0000 UTC" firstStartedPulling="2026-01-28 18:48:32.858529867 +0000 UTC m=+780.870056662" lastFinishedPulling="2026-01-28 18:48:36.416939785 +0000 UTC m=+784.428466560" observedRunningTime="2026-01-28 18:48:37.472924681 +0000 UTC m=+785.484451476" watchObservedRunningTime="2026-01-28 18:48:37.48027286 +0000 UTC m=+785.491799635" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.503241 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.8195623530000002 podStartE2EDuration="7.503214057s" podCreationTimestamp="2026-01-28 18:48:30 +0000 UTC" firstStartedPulling="2026-01-28 18:48:33.036279211 +0000 UTC m=+781.047805986" lastFinishedPulling="2026-01-28 18:48:36.719930915 +0000 UTC m=+784.731457690" observedRunningTime="2026-01-28 18:48:37.49712787 +0000 UTC m=+785.508654645" watchObservedRunningTime="2026-01-28 18:48:37.503214057 +0000 UTC m=+785.514740832" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.566613 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" podStartSLOduration=2.693439803 podStartE2EDuration="7.56659351s" podCreationTimestamp="2026-01-28 18:48:30 +0000 UTC" firstStartedPulling="2026-01-28 18:48:31.568237702 +0000 UTC m=+779.579764477" lastFinishedPulling="2026-01-28 18:48:36.441391409 +0000 UTC m=+784.452918184" observedRunningTime="2026-01-28 18:48:37.524578518 +0000 UTC m=+785.536105293" watchObservedRunningTime="2026-01-28 18:48:37.56659351 +0000 UTC m=+785.578120285" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.601002 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76788598db-qphbq" podStartSLOduration=2.6224659949999998 
podStartE2EDuration="7.600937955s" podCreationTimestamp="2026-01-28 18:48:30 +0000 UTC" firstStartedPulling="2026-01-28 18:48:31.669668679 +0000 UTC m=+779.681195454" lastFinishedPulling="2026-01-28 18:48:36.648140639 +0000 UTC m=+784.659667414" observedRunningTime="2026-01-28 18:48:37.59909261 +0000 UTC m=+785.610619395" watchObservedRunningTime="2026-01-28 18:48:37.600937955 +0000 UTC m=+785.612464740" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.601475 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.8854935189999997 podStartE2EDuration="7.601468518s" podCreationTimestamp="2026-01-28 18:48:30 +0000 UTC" firstStartedPulling="2026-01-28 18:48:32.731268342 +0000 UTC m=+780.742795117" lastFinishedPulling="2026-01-28 18:48:36.447243351 +0000 UTC m=+784.458770116" observedRunningTime="2026-01-28 18:48:37.564753544 +0000 UTC m=+785.576280329" watchObservedRunningTime="2026-01-28 18:48:37.601468518 +0000 UTC m=+785.612995303" Jan 28 18:48:37 crc kubenswrapper[4749]: I0128 18:48:37.627016 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" podStartSLOduration=2.724287052 podStartE2EDuration="7.626999869s" podCreationTimestamp="2026-01-28 18:48:30 +0000 UTC" firstStartedPulling="2026-01-28 18:48:31.572180397 +0000 UTC m=+779.583707172" lastFinishedPulling="2026-01-28 18:48:36.474893214 +0000 UTC m=+784.486419989" observedRunningTime="2026-01-28 18:48:37.622985911 +0000 UTC m=+785.634512686" watchObservedRunningTime="2026-01-28 18:48:37.626999869 +0000 UTC m=+785.638526644" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.477664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" event={"ID":"ca507b0e-375b-47b0-bd5e-c77f2bc7d521","Type":"ContainerStarted","Data":"654e7c035cd7aecab54d92c7f39b7ea00cfb987af1c2efee67ec32baf7216364"} Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.478611 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.478679 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.480345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" event={"ID":"cdcc709f-ccfa-4927-a1a3-333e7f810817","Type":"ContainerStarted","Data":"31fef1939f4b2ca218b1e555d8f08010306fbf9c482f007fbdf0f60bdefdb72b"} Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.480556 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.480596 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.487908 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.489218 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:39 crc 
kubenswrapper[4749]: I0128 18:48:39.493752 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.495371 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.499136 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" podStartSLOduration=2.480552788 podStartE2EDuration="8.499116217s" podCreationTimestamp="2026-01-28 18:48:31 +0000 UTC" firstStartedPulling="2026-01-28 18:48:32.678680203 +0000 UTC m=+780.690206968" lastFinishedPulling="2026-01-28 18:48:38.697243622 +0000 UTC m=+786.708770397" observedRunningTime="2026-01-28 18:48:39.497605981 +0000 UTC m=+787.509132776" watchObservedRunningTime="2026-01-28 18:48:39.499116217 +0000 UTC m=+787.510642992" Jan 28 18:48:39 crc kubenswrapper[4749]: I0128 18:48:39.545952 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5b6db5567f-9vjtq" podStartSLOduration=2.347019429 podStartE2EDuration="8.545932466s" podCreationTimestamp="2026-01-28 18:48:31 +0000 UTC" firstStartedPulling="2026-01-28 18:48:32.514096969 +0000 UTC m=+780.525623744" lastFinishedPulling="2026-01-28 18:48:38.713010006 +0000 UTC m=+786.724536781" observedRunningTime="2026-01-28 18:48:39.544581023 +0000 UTC m=+787.556107808" watchObservedRunningTime="2026-01-28 18:48:39.545932466 +0000 UTC m=+787.557459241" Jan 28 18:48:51 crc kubenswrapper[4749]: I0128 18:48:51.122017 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76788598db-qphbq" Jan 28 18:48:51 crc kubenswrapper[4749]: I0128 18:48:51.263468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-69d9546745-cfwmd" Jan 28 18:48:52 crc kubenswrapper[4749]: I0128 18:48:52.126965 4749 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Jan 28 18:48:52 crc kubenswrapper[4749]: I0128 18:48:52.127024 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="74be48ce-557a-4f08-931e-5f222458fbe3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 28 18:48:52 crc kubenswrapper[4749]: I0128 18:48:52.422568 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Jan 28 18:48:52 crc kubenswrapper[4749]: I0128 18:48:52.538715 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.519206 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xskbs"] Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.520832 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.537880 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xskbs"] Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.641772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-utilities\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.641894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-catalog-content\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.641963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwf2q\" (UniqueName: \"kubernetes.io/projected/895b587c-12d6-4bac-802f-6bf48ee86511-kube-api-access-mwf2q\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.743005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-catalog-content\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.743111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwf2q\" (UniqueName: \"kubernetes.io/projected/895b587c-12d6-4bac-802f-6bf48ee86511-kube-api-access-mwf2q\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.743157 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-utilities\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.743711 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-utilities\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.743708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-catalog-content\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.792278 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mwf2q\" (UniqueName: \"kubernetes.io/projected/895b587c-12d6-4bac-802f-6bf48ee86511-kube-api-access-mwf2q\") pod \"community-operators-xskbs\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:53 crc kubenswrapper[4749]: I0128 18:48:53.842979 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:48:54 crc kubenswrapper[4749]: I0128 18:48:54.350786 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xskbs"] Jan 28 18:48:54 crc kubenswrapper[4749]: I0128 18:48:54.585308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xskbs" event={"ID":"895b587c-12d6-4bac-802f-6bf48ee86511","Type":"ContainerStarted","Data":"a47628e48977a04f4db365ab99d6249d7980f0d9fa3983b6c9bce0c8e43a04e4"} Jan 28 18:48:55 crc kubenswrapper[4749]: I0128 18:48:55.593660 4749 generic.go:334] "Generic (PLEG): container finished" podID="895b587c-12d6-4bac-802f-6bf48ee86511" containerID="ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd" exitCode=0 Jan 28 18:48:55 crc kubenswrapper[4749]: I0128 18:48:55.593704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xskbs" event={"ID":"895b587c-12d6-4bac-802f-6bf48ee86511","Type":"ContainerDied","Data":"ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd"} Jan 28 18:48:56 crc kubenswrapper[4749]: I0128 18:48:56.603593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xskbs" event={"ID":"895b587c-12d6-4bac-802f-6bf48ee86511","Type":"ContainerStarted","Data":"456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c"} Jan 28 18:48:57 crc kubenswrapper[4749]: I0128 18:48:57.612228 4749 generic.go:334] "Generic (PLEG): container finished" podID="895b587c-12d6-4bac-802f-6bf48ee86511" containerID="456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c" exitCode=0 Jan 28 18:48:57 crc kubenswrapper[4749]: I0128 18:48:57.612357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xskbs" event={"ID":"895b587c-12d6-4bac-802f-6bf48ee86511","Type":"ContainerDied","Data":"456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c"} Jan 28 18:48:58 crc kubenswrapper[4749]: I0128 18:48:58.621454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xskbs" event={"ID":"895b587c-12d6-4bac-802f-6bf48ee86511","Type":"ContainerStarted","Data":"259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02"} Jan 28 18:48:58 crc kubenswrapper[4749]: I0128 18:48:58.645467 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xskbs" podStartSLOduration=3.083314933 podStartE2EDuration="5.645441746s" podCreationTimestamp="2026-01-28 18:48:53 +0000 UTC" firstStartedPulling="2026-01-28 18:48:55.595161978 +0000 UTC m=+803.606688763" lastFinishedPulling="2026-01-28 18:48:58.157288801 +0000 UTC m=+806.168815576" observedRunningTime="2026-01-28 18:48:58.641721815 +0000 UTC m=+806.653248600" watchObservedRunningTime="2026-01-28 18:48:58.645441746 +0000 UTC m=+806.656968521" Jan 28 18:49:00 crc kubenswrapper[4749]: I0128 18:49:00.991990 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-qgkbc" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.067458 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ck9cg"] Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.068934 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.082792 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ck9cg"] Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.166484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qn6\" (UniqueName: \"kubernetes.io/projected/29574f08-643e-4f54-8d09-6d4200568269-kube-api-access-l8qn6\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.166592 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-catalog-content\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.166687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-utilities\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.268921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-utilities\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.269046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qn6\" (UniqueName: \"kubernetes.io/projected/29574f08-643e-4f54-8d09-6d4200568269-kube-api-access-l8qn6\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.269121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-catalog-content\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.269502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-utilities\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.269648 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-catalog-content\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.291075 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qn6\" (UniqueName: \"kubernetes.io/projected/29574f08-643e-4f54-8d09-6d4200568269-kube-api-access-l8qn6\") pod \"certified-operators-ck9cg\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.392010 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:01 crc kubenswrapper[4749]: I0128 18:49:01.707518 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ck9cg"] Jan 28 18:49:02 crc kubenswrapper[4749]: I0128 18:49:02.132192 4749 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Jan 28 18:49:02 crc kubenswrapper[4749]: I0128 18:49:02.132270 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="74be48ce-557a-4f08-931e-5f222458fbe3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 28 18:49:02 crc kubenswrapper[4749]: I0128 18:49:02.657218 4749 generic.go:334] "Generic (PLEG): container finished" podID="29574f08-643e-4f54-8d09-6d4200568269" containerID="6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565" exitCode=0 Jan 28 18:49:02 crc kubenswrapper[4749]: I0128 18:49:02.657625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck9cg" event={"ID":"29574f08-643e-4f54-8d09-6d4200568269","Type":"ContainerDied","Data":"6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565"} Jan 28 18:49:02 crc kubenswrapper[4749]: I0128 18:49:02.657657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck9cg" event={"ID":"29574f08-643e-4f54-8d09-6d4200568269","Type":"ContainerStarted","Data":"c02a904a3e73338a5b47e3cd7a48c01e6a49939bb19bc6661c1102da877ac2b3"} Jan 28 18:49:03 crc kubenswrapper[4749]: I0128 18:49:03.669500 4749 generic.go:334] "Generic (PLEG): container finished" podID="29574f08-643e-4f54-8d09-6d4200568269" containerID="b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1" exitCode=0 Jan 28 18:49:03 crc kubenswrapper[4749]: I0128 18:49:03.669870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck9cg" event={"ID":"29574f08-643e-4f54-8d09-6d4200568269","Type":"ContainerDied","Data":"b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1"} Jan 28 18:49:03 crc kubenswrapper[4749]: I0128 18:49:03.844185 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:49:03 crc kubenswrapper[4749]: I0128 18:49:03.844247 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:49:03 crc 
kubenswrapper[4749]: I0128 18:49:03.891618 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:49:04 crc kubenswrapper[4749]: I0128 18:49:04.679069 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck9cg" event={"ID":"29574f08-643e-4f54-8d09-6d4200568269","Type":"ContainerStarted","Data":"94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79"} Jan 28 18:49:04 crc kubenswrapper[4749]: I0128 18:49:04.706733 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ck9cg" podStartSLOduration=2.190641256 podStartE2EDuration="3.706707054s" podCreationTimestamp="2026-01-28 18:49:01 +0000 UTC" firstStartedPulling="2026-01-28 18:49:02.659165477 +0000 UTC m=+810.670692252" lastFinishedPulling="2026-01-28 18:49:04.175231275 +0000 UTC m=+812.186758050" observedRunningTime="2026-01-28 18:49:04.701148098 +0000 UTC m=+812.712674883" watchObservedRunningTime="2026-01-28 18:49:04.706707054 +0000 UTC m=+812.718233829" Jan 28 18:49:04 crc kubenswrapper[4749]: I0128 18:49:04.727120 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:49:06 crc kubenswrapper[4749]: I0128 18:49:06.302106 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xskbs"] Jan 28 18:49:06 crc kubenswrapper[4749]: I0128 18:49:06.693701 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xskbs" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" containerName="registry-server" containerID="cri-o://259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02" gracePeriod=2 Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.063759 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.176731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwf2q\" (UniqueName: \"kubernetes.io/projected/895b587c-12d6-4bac-802f-6bf48ee86511-kube-api-access-mwf2q\") pod \"895b587c-12d6-4bac-802f-6bf48ee86511\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.176851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-catalog-content\") pod \"895b587c-12d6-4bac-802f-6bf48ee86511\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.176949 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-utilities\") pod \"895b587c-12d6-4bac-802f-6bf48ee86511\" (UID: \"895b587c-12d6-4bac-802f-6bf48ee86511\") " Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.178227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-utilities" (OuterVolumeSpecName: "utilities") pod "895b587c-12d6-4bac-802f-6bf48ee86511" (UID: "895b587c-12d6-4bac-802f-6bf48ee86511"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.182980 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895b587c-12d6-4bac-802f-6bf48ee86511-kube-api-access-mwf2q" (OuterVolumeSpecName: "kube-api-access-mwf2q") pod "895b587c-12d6-4bac-802f-6bf48ee86511" (UID: "895b587c-12d6-4bac-802f-6bf48ee86511"). InnerVolumeSpecName "kube-api-access-mwf2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.278612 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwf2q\" (UniqueName: \"kubernetes.io/projected/895b587c-12d6-4bac-802f-6bf48ee86511-kube-api-access-mwf2q\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.278657 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.589123 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "895b587c-12d6-4bac-802f-6bf48ee86511" (UID: "895b587c-12d6-4bac-802f-6bf48ee86511"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.685810 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895b587c-12d6-4bac-802f-6bf48ee86511-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.711593 4749 generic.go:334] "Generic (PLEG): container finished" podID="895b587c-12d6-4bac-802f-6bf48ee86511" containerID="259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02" exitCode=0 Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.711622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xskbs" event={"ID":"895b587c-12d6-4bac-802f-6bf48ee86511","Type":"ContainerDied","Data":"259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02"} Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.711666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xskbs" event={"ID":"895b587c-12d6-4bac-802f-6bf48ee86511","Type":"ContainerDied","Data":"a47628e48977a04f4db365ab99d6249d7980f0d9fa3983b6c9bce0c8e43a04e4"} Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.711699 4749 scope.go:117] "RemoveContainer" containerID="259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.711720 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xskbs" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.747812 4749 scope.go:117] "RemoveContainer" containerID="456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.751574 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xskbs"] Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.762219 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xskbs"] Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.819649 4749 scope.go:117] "RemoveContainer" containerID="ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.846842 4749 scope.go:117] "RemoveContainer" containerID="259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02" Jan 28 18:49:07 crc kubenswrapper[4749]: E0128 18:49:07.848166 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02\": container with ID starting with 259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02 not found: ID does not exist" containerID="259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.848210 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02"} err="failed to get container status \"259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02\": rpc error: code = NotFound desc = could not find container \"259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02\": container with ID starting with 259c5d048d1a606ce13e970e681f9b108f7aa8d1523d51739bf8d385b7498e02 not found: ID does not exist" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.848238 4749 scope.go:117] "RemoveContainer" containerID="456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c" Jan 28 18:49:07 crc kubenswrapper[4749]: E0128 18:49:07.848662 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c\": container with ID starting with 456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c not found: ID does not exist" containerID="456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.848692 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c"} err="failed to get container status \"456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c\": rpc error: code = NotFound desc = could not find container \"456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c\": container with ID starting with 456e5d41ec112cc97b8ef6365b4dd7532d4f211fef1b9a726d2a669b99b71f4c not found: ID does not exist" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.848708 4749 scope.go:117] "RemoveContainer" containerID="ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd" Jan 28 18:49:07 crc kubenswrapper[4749]: E0128 18:49:07.849146 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd\": container with ID starting with ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd not found: ID does not exist" containerID="ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd" Jan 28 18:49:07 crc kubenswrapper[4749]: I0128 18:49:07.849171 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd"} err="failed to get container status \"ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd\": rpc error: code = NotFound desc = could not find container \"ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd\": container with ID starting with ba5cd68096e6cb4794e62dfcb2c3fef8662dad948c6607865be5d7ff4cbef7dd not found: ID does not exist" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.702754 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lv76f"] Jan 28 18:49:08 crc kubenswrapper[4749]: E0128 18:49:08.703020 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" containerName="extract-utilities" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.703032 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" containerName="extract-utilities" Jan 28 18:49:08 crc kubenswrapper[4749]: E0128 18:49:08.703045 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" containerName="extract-content" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.703051 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" containerName="extract-content" Jan 28 18:49:08 crc kubenswrapper[4749]: E0128 18:49:08.703068 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" containerName="registry-server" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.703076 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" containerName="registry-server" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.703228 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" containerName="registry-server" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.704349 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.714490 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lv76f"] Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.802871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-utilities\") pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.802933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmjf\" (UniqueName: \"kubernetes.io/projected/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-kube-api-access-wqmjf\") pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.803030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-catalog-content\") pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.880251 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895b587c-12d6-4bac-802f-6bf48ee86511" path="/var/lib/kubelet/pods/895b587c-12d6-4bac-802f-6bf48ee86511/volumes" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.904459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-catalog-content\") pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.904572 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-utilities\") pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.904604 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmjf\" (UniqueName: \"kubernetes.io/projected/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-kube-api-access-wqmjf\") pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.904979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-catalog-content\") pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.905044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-utilities\") 
pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:08 crc kubenswrapper[4749]: I0128 18:49:08.927282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmjf\" (UniqueName: \"kubernetes.io/projected/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-kube-api-access-wqmjf\") pod \"redhat-marketplace-lv76f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:09 crc kubenswrapper[4749]: I0128 18:49:09.021911 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:09 crc kubenswrapper[4749]: I0128 18:49:09.226857 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lv76f"] Jan 28 18:49:09 crc kubenswrapper[4749]: I0128 18:49:09.727586 4749 generic.go:334] "Generic (PLEG): container finished" podID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerID="df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903" exitCode=0 Jan 28 18:49:09 crc kubenswrapper[4749]: I0128 18:49:09.727653 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lv76f" event={"ID":"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f","Type":"ContainerDied","Data":"df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903"} Jan 28 18:49:09 crc kubenswrapper[4749]: I0128 18:49:09.727947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lv76f" event={"ID":"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f","Type":"ContainerStarted","Data":"d35521d7621178eb1aa60815f4e2ef6f561ab452abb6c949b39f3cddba930036"} Jan 28 18:49:10 crc kubenswrapper[4749]: I0128 18:49:10.736926 4749 generic.go:334] "Generic (PLEG): container finished" podID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerID="c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2" exitCode=0 Jan 28 18:49:10 crc kubenswrapper[4749]: I0128 18:49:10.737035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lv76f" event={"ID":"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f","Type":"ContainerDied","Data":"c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2"} Jan 28 18:49:11 crc kubenswrapper[4749]: I0128 18:49:11.393152 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:11 crc kubenswrapper[4749]: I0128 18:49:11.393537 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:11 crc kubenswrapper[4749]: I0128 18:49:11.434936 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:11 crc kubenswrapper[4749]: I0128 18:49:11.745991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lv76f" event={"ID":"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f","Type":"ContainerStarted","Data":"7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56"} Jan 28 18:49:11 crc kubenswrapper[4749]: I0128 18:49:11.766456 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lv76f" podStartSLOduration=2.369951479 podStartE2EDuration="3.766434769s" 
podCreationTimestamp="2026-01-28 18:49:08 +0000 UTC" firstStartedPulling="2026-01-28 18:49:09.730238508 +0000 UTC m=+817.741765283" lastFinishedPulling="2026-01-28 18:49:11.126721798 +0000 UTC m=+819.138248573" observedRunningTime="2026-01-28 18:49:11.76278618 +0000 UTC m=+819.774312995" watchObservedRunningTime="2026-01-28 18:49:11.766434769 +0000 UTC m=+819.777961554" Jan 28 18:49:11 crc kubenswrapper[4749]: I0128 18:49:11.799731 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:12 crc kubenswrapper[4749]: I0128 18:49:12.126065 4749 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Jan 28 18:49:12 crc kubenswrapper[4749]: I0128 18:49:12.126138 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="74be48ce-557a-4f08-931e-5f222458fbe3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 28 18:49:14 crc kubenswrapper[4749]: I0128 18:49:14.095957 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ck9cg"] Jan 28 18:49:14 crc kubenswrapper[4749]: I0128 18:49:14.096630 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ck9cg" podUID="29574f08-643e-4f54-8d09-6d4200568269" containerName="registry-server" containerID="cri-o://94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79" gracePeriod=2 Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.619261 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.775523 4749 generic.go:334] "Generic (PLEG): container finished" podID="29574f08-643e-4f54-8d09-6d4200568269" containerID="94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79" exitCode=0 Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.775584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck9cg" event={"ID":"29574f08-643e-4f54-8d09-6d4200568269","Type":"ContainerDied","Data":"94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79"} Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.775608 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ck9cg" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.775631 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ck9cg" event={"ID":"29574f08-643e-4f54-8d09-6d4200568269","Type":"ContainerDied","Data":"c02a904a3e73338a5b47e3cd7a48c01e6a49939bb19bc6661c1102da877ac2b3"} Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.775655 4749 scope.go:117] "RemoveContainer" containerID="94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.793746 4749 scope.go:117] "RemoveContainer" containerID="b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.814433 4749 scope.go:117] "RemoveContainer" containerID="6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.815163 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-catalog-content\") pod \"29574f08-643e-4f54-8d09-6d4200568269\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.815215 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-utilities\") pod \"29574f08-643e-4f54-8d09-6d4200568269\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.815264 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8qn6\" (UniqueName: \"kubernetes.io/projected/29574f08-643e-4f54-8d09-6d4200568269-kube-api-access-l8qn6\") pod \"29574f08-643e-4f54-8d09-6d4200568269\" (UID: \"29574f08-643e-4f54-8d09-6d4200568269\") " Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.816295 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-utilities" (OuterVolumeSpecName: "utilities") pod "29574f08-643e-4f54-8d09-6d4200568269" (UID: "29574f08-643e-4f54-8d09-6d4200568269"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.821730 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29574f08-643e-4f54-8d09-6d4200568269-kube-api-access-l8qn6" (OuterVolumeSpecName: "kube-api-access-l8qn6") pod "29574f08-643e-4f54-8d09-6d4200568269" (UID: "29574f08-643e-4f54-8d09-6d4200568269"). InnerVolumeSpecName "kube-api-access-l8qn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.863497 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29574f08-643e-4f54-8d09-6d4200568269" (UID: "29574f08-643e-4f54-8d09-6d4200568269"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.870035 4749 scope.go:117] "RemoveContainer" containerID="94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79" Jan 28 18:49:15 crc kubenswrapper[4749]: E0128 18:49:15.870644 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79\": container with ID starting with 94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79 not found: ID does not exist" containerID="94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.870715 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79"} err="failed to get container status \"94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79\": rpc error: code = NotFound desc = could not find container \"94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79\": container with ID starting with 94cf7ca8cb1dcd6e9b9a0ae705d1718f1ea27d25841a084b38ee049f73b73c79 not found: ID does not exist" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.870764 4749 scope.go:117] "RemoveContainer" containerID="b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1" Jan 28 18:49:15 crc kubenswrapper[4749]: E0128 18:49:15.872109 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1\": container with ID starting with b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1 not found: ID does not exist" containerID="b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.872138 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1"} err="failed to get container status \"b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1\": rpc error: code = NotFound desc = could not find container \"b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1\": container with ID starting with b953c59e04662307db2f320b217c2ec38919f5134f8f8d2d3b606d69b34779b1 not found: ID does not exist" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.872155 4749 scope.go:117] "RemoveContainer" containerID="6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565" Jan 28 18:49:15 crc kubenswrapper[4749]: E0128 18:49:15.872408 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565\": container with ID starting with 6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565 not found: ID does not exist" containerID="6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.872459 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565"} err="failed to get container status \"6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565\": rpc error: code = NotFound desc = could not 
find container \"6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565\": container with ID starting with 6262d1fdb9c83f027d4d1d425378952c5f55fc03bedf11e798166c59e13dc565 not found: ID does not exist" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.918234 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.918698 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29574f08-643e-4f54-8d09-6d4200568269-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:15 crc kubenswrapper[4749]: I0128 18:49:15.918722 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8qn6\" (UniqueName: \"kubernetes.io/projected/29574f08-643e-4f54-8d09-6d4200568269-kube-api-access-l8qn6\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:16 crc kubenswrapper[4749]: I0128 18:49:16.114040 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ck9cg"] Jan 28 18:49:16 crc kubenswrapper[4749]: I0128 18:49:16.121178 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ck9cg"] Jan 28 18:49:16 crc kubenswrapper[4749]: I0128 18:49:16.880440 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29574f08-643e-4f54-8d09-6d4200568269" path="/var/lib/kubelet/pods/29574f08-643e-4f54-8d09-6d4200568269/volumes" Jan 28 18:49:19 crc kubenswrapper[4749]: I0128 18:49:19.023118 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:19 crc kubenswrapper[4749]: I0128 18:49:19.023584 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:19 crc kubenswrapper[4749]: I0128 18:49:19.068781 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:19 crc kubenswrapper[4749]: I0128 18:49:19.848594 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:19 crc kubenswrapper[4749]: I0128 18:49:19.897813 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lv76f"] Jan 28 18:49:21 crc kubenswrapper[4749]: I0128 18:49:21.819069 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lv76f" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerName="registry-server" containerID="cri-o://7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56" gracePeriod=2 Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.125699 4749 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.126066 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="74be48ce-557a-4f08-931e-5f222458fbe3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.809610 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.829315 4749 generic.go:334] "Generic (PLEG): container finished" podID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerID="7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56" exitCode=0 Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.829392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lv76f" event={"ID":"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f","Type":"ContainerDied","Data":"7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56"} Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.829423 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lv76f" event={"ID":"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f","Type":"ContainerDied","Data":"d35521d7621178eb1aa60815f4e2ef6f561ab452abb6c949b39f3cddba930036"} Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.829443 4749 scope.go:117] "RemoveContainer" containerID="7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.829585 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lv76f" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.847180 4749 scope.go:117] "RemoveContainer" containerID="c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.868873 4749 scope.go:117] "RemoveContainer" containerID="df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.902498 4749 scope.go:117] "RemoveContainer" containerID="7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56" Jan 28 18:49:22 crc kubenswrapper[4749]: E0128 18:49:22.903218 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56\": container with ID starting with 7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56 not found: ID does not exist" containerID="7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.903264 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56"} err="failed to get container status \"7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56\": rpc error: code = NotFound desc = could not find container \"7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56\": container with ID starting with 7dd5048a7cd943ce05247203777b5dfc9b87d05d92fb6ac34cb653e9373bde56 not found: ID does not exist" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.903291 4749 scope.go:117] "RemoveContainer" containerID="c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2" Jan 28 18:49:22 crc kubenswrapper[4749]: E0128 18:49:22.903693 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2\": container with ID starting with 
c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2 not found: ID does not exist" containerID="c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.903730 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2"} err="failed to get container status \"c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2\": rpc error: code = NotFound desc = could not find container \"c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2\": container with ID starting with c2aa11e9255a81ca9255cc7b07721445b2a6b6fb17a2765631b817db806c36f2 not found: ID does not exist" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.903747 4749 scope.go:117] "RemoveContainer" containerID="df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903" Jan 28 18:49:22 crc kubenswrapper[4749]: E0128 18:49:22.904712 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903\": container with ID starting with df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903 not found: ID does not exist" containerID="df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.904772 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903"} err="failed to get container status \"df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903\": rpc error: code = NotFound desc = could not find container \"df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903\": container with ID starting with df0e91626691eb95df7f1dba080fb1d1bff5c0ec07f951dc7aa7e68f36fda903 not found: ID does not exist" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.967582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-catalog-content\") pod \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.967707 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqmjf\" (UniqueName: \"kubernetes.io/projected/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-kube-api-access-wqmjf\") pod \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.968452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-utilities\") pod \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\" (UID: \"c4e9f114-86b6-4eb6-8e64-86d5220e4b5f\") " Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.969968 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-utilities" (OuterVolumeSpecName: "utilities") pod "c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" (UID: "c4e9f114-86b6-4eb6-8e64-86d5220e4b5f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.973948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-kube-api-access-wqmjf" (OuterVolumeSpecName: "kube-api-access-wqmjf") pod "c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" (UID: "c4e9f114-86b6-4eb6-8e64-86d5220e4b5f"). InnerVolumeSpecName "kube-api-access-wqmjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:49:22 crc kubenswrapper[4749]: I0128 18:49:22.989642 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" (UID: "c4e9f114-86b6-4eb6-8e64-86d5220e4b5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:49:23 crc kubenswrapper[4749]: I0128 18:49:23.070168 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqmjf\" (UniqueName: \"kubernetes.io/projected/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-kube-api-access-wqmjf\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:23 crc kubenswrapper[4749]: I0128 18:49:23.070213 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:23 crc kubenswrapper[4749]: I0128 18:49:23.070227 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:23 crc kubenswrapper[4749]: I0128 18:49:23.158058 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lv76f"] Jan 28 18:49:23 crc kubenswrapper[4749]: I0128 18:49:23.164428 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lv76f"] Jan 28 18:49:24 crc kubenswrapper[4749]: I0128 18:49:24.879200 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" path="/var/lib/kubelet/pods/c4e9f114-86b6-4eb6-8e64-86d5220e4b5f/volumes" Jan 28 18:49:32 crc kubenswrapper[4749]: I0128 18:49:32.127810 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.990150 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-l46rz"] Jan 28 18:49:49 crc kubenswrapper[4749]: E0128 18:49:49.990941 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29574f08-643e-4f54-8d09-6d4200568269" containerName="extract-content" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.990954 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29574f08-643e-4f54-8d09-6d4200568269" containerName="extract-content" Jan 28 18:49:49 crc kubenswrapper[4749]: E0128 18:49:49.990962 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29574f08-643e-4f54-8d09-6d4200568269" containerName="registry-server" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.990968 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29574f08-643e-4f54-8d09-6d4200568269" containerName="registry-server" Jan 28 18:49:49 crc kubenswrapper[4749]: E0128 
18:49:49.990977 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerName="registry-server" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.990983 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerName="registry-server" Jan 28 18:49:49 crc kubenswrapper[4749]: E0128 18:49:49.990996 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerName="extract-content" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.991001 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerName="extract-content" Jan 28 18:49:49 crc kubenswrapper[4749]: E0128 18:49:49.991022 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerName="extract-utilities" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.991028 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerName="extract-utilities" Jan 28 18:49:49 crc kubenswrapper[4749]: E0128 18:49:49.991036 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29574f08-643e-4f54-8d09-6d4200568269" containerName="extract-utilities" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.991043 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29574f08-643e-4f54-8d09-6d4200568269" containerName="extract-utilities" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.991157 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="29574f08-643e-4f54-8d09-6d4200568269" containerName="registry-server" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.991177 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e9f114-86b6-4eb6-8e64-86d5220e4b5f" containerName="registry-server" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.991699 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-l46rz" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.993749 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.993873 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.994770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-sl9fv" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.995180 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Jan 28 18:49:49 crc kubenswrapper[4749]: I0128 18:49:49.995499 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.003888 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.014441 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-l46rz"] Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.064981 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-l46rz"] Jan 28 18:49:50 crc kubenswrapper[4749]: E0128 18:49:50.065634 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-998f8 metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-l46rz" podUID="49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.129969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-metrics\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130040 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-entrypoint\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-trusted-ca\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"collector-token\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-token\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-sa-token\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-tmp\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-datadir\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config-openshift-service-cacrt\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-998f8\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-kube-api-access-998f8\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.130547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-syslog-receiver\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.231935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-syslog-receiver\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232071 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-metrics\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-entrypoint\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-trusted-ca\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232176 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-token\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232198 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-sa-token\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-tmp\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-datadir\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config-openshift-service-cacrt\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-998f8\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-kube-api-access-998f8\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.232404 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-datadir\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.233072 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config-openshift-service-cacrt\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.233546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-entrypoint\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.233667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.234421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-trusted-ca\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.238141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-token\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.241686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-tmp\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.241996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-metrics\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.242176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-syslog-receiver\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.252958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-998f8\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-kube-api-access-998f8\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 18:49:50 crc kubenswrapper[4749]: I0128 18:49:50.253271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-sa-token\") pod \"collector-l46rz\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " pod="openshift-logging/collector-l46rz" Jan 28 
18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.020846 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-l46rz" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.031505 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-l46rz" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145638 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-token\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145711 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-datadir\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-entrypoint\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145814 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-metrics\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145868 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-trusted-ca\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145890 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-998f8\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-kube-api-access-998f8\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145922 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-sa-token\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145948 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.145986 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-tmp\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.146043 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config-openshift-service-cacrt\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.146078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-syslog-receiver\") pod \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\" (UID: \"49a852f2-3c3c-4041-a2bc-1a9ca2d930b9\") " Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.146381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-datadir" (OuterVolumeSpecName: "datadir") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.146893 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.146910 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.146977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config" (OuterVolumeSpecName: "config") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.147197 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.150040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-kube-api-access-998f8" (OuterVolumeSpecName: "kube-api-access-998f8") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "kube-api-access-998f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.150142 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-sa-token" (OuterVolumeSpecName: "sa-token") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.150750 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-metrics" (OuterVolumeSpecName: "metrics") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.151116 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-token" (OuterVolumeSpecName: "collector-token") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.151260 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.151550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-tmp" (OuterVolumeSpecName: "tmp") pod "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" (UID: "49a852f2-3c3c-4041-a2bc-1a9ca2d930b9"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.247939 4749 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-entrypoint\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248018 4749 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-metrics\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248032 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248044 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-998f8\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-kube-api-access-998f8\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248058 4749 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248070 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248079 4749 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-tmp\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248091 4749 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248103 4749 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248117 4749 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-collector-token\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:51 crc kubenswrapper[4749]: I0128 18:49:51.248129 4749 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9-datadir\") on node \"crc\" DevicePath \"\"" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.027272 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-l46rz" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.076554 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-l46rz"] Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.081580 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-l46rz"] Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.086712 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-6gh4z"] Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.087560 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.091715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.091853 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.091944 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.092056 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.093639 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-sl9fv" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.101865 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.103026 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-6gh4z"] Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-trusted-ca\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163206 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbbht\" (UniqueName: \"kubernetes.io/projected/52382cc0-233b-441a-b56f-24c303211d2a-kube-api-access-bbbht\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-collector-token\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163247 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-config-openshift-service-cacrt\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc 
kubenswrapper[4749]: I0128 18:49:52.163267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/52382cc0-233b-441a-b56f-24c303211d2a-datadir\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/52382cc0-233b-441a-b56f-24c303211d2a-sa-token\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163488 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52382cc0-233b-441a-b56f-24c303211d2a-tmp\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163516 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-config\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-entrypoint\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-collector-syslog-receiver\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.163710 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-metrics\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.264971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-config\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-entrypoint\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: 
\"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-collector-syslog-receiver\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-metrics\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-trusted-ca\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbbht\" (UniqueName: \"kubernetes.io/projected/52382cc0-233b-441a-b56f-24c303211d2a-kube-api-access-bbbht\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265208 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-collector-token\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-config-openshift-service-cacrt\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/52382cc0-233b-441a-b56f-24c303211d2a-datadir\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/52382cc0-233b-441a-b56f-24c303211d2a-sa-token\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52382cc0-233b-441a-b56f-24c303211d2a-tmp\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.265466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/52382cc0-233b-441a-b56f-24c303211d2a-datadir\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.266105 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-config-openshift-service-cacrt\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.266148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-entrypoint\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.266115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-config\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.266209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52382cc0-233b-441a-b56f-24c303211d2a-trusted-ca\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.268287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-metrics\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.268418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-collector-syslog-receiver\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.268679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/52382cc0-233b-441a-b56f-24c303211d2a-collector-token\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.273106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52382cc0-233b-441a-b56f-24c303211d2a-tmp\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.285432 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbbht\" (UniqueName: \"kubernetes.io/projected/52382cc0-233b-441a-b56f-24c303211d2a-kube-api-access-bbbht\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.285684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/52382cc0-233b-441a-b56f-24c303211d2a-sa-token\") pod \"collector-6gh4z\" (UID: \"52382cc0-233b-441a-b56f-24c303211d2a\") " pod="openshift-logging/collector-6gh4z" Jan 28 
18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.406173 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-6gh4z" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.879971 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a852f2-3c3c-4041-a2bc-1a9ca2d930b9" path="/var/lib/kubelet/pods/49a852f2-3c3c-4041-a2bc-1a9ca2d930b9/volumes" Jan 28 18:49:52 crc kubenswrapper[4749]: I0128 18:49:52.926478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-6gh4z"] Jan 28 18:49:53 crc kubenswrapper[4749]: I0128 18:49:53.035648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-6gh4z" event={"ID":"52382cc0-233b-441a-b56f-24c303211d2a","Type":"ContainerStarted","Data":"40fef6c335c74cc5981f9e4da0bb3799f2c3b967d9bcd7d6dca4dc4121575e15"} Jan 28 18:50:07 crc kubenswrapper[4749]: I0128 18:50:07.131530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-6gh4z" event={"ID":"52382cc0-233b-441a-b56f-24c303211d2a","Type":"ContainerStarted","Data":"3d1141e163d60de8243c872a0a99e4bd0f4a1d26973e15eb5bea45f37e744b7f"} Jan 28 18:50:07 crc kubenswrapper[4749]: I0128 18:50:07.173238 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-6gh4z" podStartSLOduration=1.625456796 podStartE2EDuration="15.1731983s" podCreationTimestamp="2026-01-28 18:49:52 +0000 UTC" firstStartedPulling="2026-01-28 18:49:52.93994381 +0000 UTC m=+860.951470585" lastFinishedPulling="2026-01-28 18:50:06.487685304 +0000 UTC m=+874.499212089" observedRunningTime="2026-01-28 18:50:07.165574235 +0000 UTC m=+875.177101030" watchObservedRunningTime="2026-01-28 18:50:07.1731983 +0000 UTC m=+875.184725085" Jan 28 18:50:27 crc kubenswrapper[4749]: I0128 18:50:27.467732 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:50:27 crc kubenswrapper[4749]: I0128 18:50:27.468926 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:50:30 crc kubenswrapper[4749]: I0128 18:50:30.851636 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r"] Jan 28 18:50:30 crc kubenswrapper[4749]: I0128 18:50:30.853511 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:30 crc kubenswrapper[4749]: I0128 18:50:30.859778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 18:50:30 crc kubenswrapper[4749]: I0128 18:50:30.883221 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r"] Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.040496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.040551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqt74\" (UniqueName: \"kubernetes.io/projected/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-kube-api-access-kqt74\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.040672 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.141845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.142156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqt74\" (UniqueName: \"kubernetes.io/projected/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-kube-api-access-kqt74\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.142211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.142409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.142715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.172651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqt74\" (UniqueName: \"kubernetes.io/projected/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-kube-api-access-kqt74\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.469742 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:31 crc kubenswrapper[4749]: I0128 18:50:31.979819 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r"] Jan 28 18:50:31 crc kubenswrapper[4749]: W0128 18:50:31.996118 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc6d7f14_257e_41dd_9e34_cdfaf74aa9c2.slice/crio-f668d7c772d239223182fc28a41f3d791569e6159cbd4243b6b5e22743d76b55 WatchSource:0}: Error finding container f668d7c772d239223182fc28a41f3d791569e6159cbd4243b6b5e22743d76b55: Status 404 returned error can't find the container with id f668d7c772d239223182fc28a41f3d791569e6159cbd4243b6b5e22743d76b55 Jan 28 18:50:32 crc kubenswrapper[4749]: I0128 18:50:32.336640 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" event={"ID":"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2","Type":"ContainerStarted","Data":"f668d7c772d239223182fc28a41f3d791569e6159cbd4243b6b5e22743d76b55"} Jan 28 18:50:35 crc kubenswrapper[4749]: I0128 18:50:35.360396 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerID="efa1f2d5dc5c49dba2a80d7b574311ae123ad03ad02d38609690f617a258369b" exitCode=0 Jan 28 18:50:35 crc kubenswrapper[4749]: I0128 18:50:35.360481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" event={"ID":"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2","Type":"ContainerDied","Data":"efa1f2d5dc5c49dba2a80d7b574311ae123ad03ad02d38609690f617a258369b"} Jan 28 18:50:39 crc kubenswrapper[4749]: I0128 18:50:39.401155 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerID="04b5ec57183644227731c1234f8fa7a9a8734369189aefeeb507079249ae200c" exitCode=0 Jan 28 18:50:39 crc kubenswrapper[4749]: I0128 18:50:39.401665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" event={"ID":"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2","Type":"ContainerDied","Data":"04b5ec57183644227731c1234f8fa7a9a8734369189aefeeb507079249ae200c"} Jan 28 18:50:40 crc kubenswrapper[4749]: I0128 18:50:40.409448 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerID="f34e2f970ddc31dd7f41a01d370f785f906f06f26b88a3403cd81d6db55472ea" exitCode=0 Jan 28 18:50:40 crc kubenswrapper[4749]: I0128 18:50:40.409502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" event={"ID":"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2","Type":"ContainerDied","Data":"f34e2f970ddc31dd7f41a01d370f785f906f06f26b88a3403cd81d6db55472ea"} Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.686252 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.744393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqt74\" (UniqueName: \"kubernetes.io/projected/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-kube-api-access-kqt74\") pod \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.744537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-bundle\") pod \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.744579 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-util\") pod \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\" (UID: \"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2\") " Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.750714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-bundle" (OuterVolumeSpecName: "bundle") pod "fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" (UID: "fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.755192 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-util" (OuterVolumeSpecName: "util") pod "fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" (UID: "fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.756680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-kube-api-access-kqt74" (OuterVolumeSpecName: "kube-api-access-kqt74") pod "fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" (UID: "fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2"). InnerVolumeSpecName "kube-api-access-kqt74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.845753 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.845785 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-util\") on node \"crc\" DevicePath \"\"" Jan 28 18:50:41 crc kubenswrapper[4749]: I0128 18:50:41.845794 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqt74\" (UniqueName: \"kubernetes.io/projected/fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2-kube-api-access-kqt74\") on node \"crc\" DevicePath \"\"" Jan 28 18:50:42 crc kubenswrapper[4749]: I0128 18:50:42.428210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" event={"ID":"fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2","Type":"ContainerDied","Data":"f668d7c772d239223182fc28a41f3d791569e6159cbd4243b6b5e22743d76b55"} Jan 28 18:50:42 crc kubenswrapper[4749]: I0128 18:50:42.428275 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f668d7c772d239223182fc28a41f3d791569e6159cbd4243b6b5e22743d76b55" Jan 28 18:50:42 crc kubenswrapper[4749]: I0128 18:50:42.428314 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.455660 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-kb6l2"] Jan 28 18:50:47 crc kubenswrapper[4749]: E0128 18:50:47.456673 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerName="util" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.456690 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerName="util" Jan 28 18:50:47 crc kubenswrapper[4749]: E0128 18:50:47.456705 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerName="pull" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.456712 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerName="pull" Jan 28 18:50:47 crc kubenswrapper[4749]: E0128 18:50:47.456734 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerName="extract" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.456742 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerName="extract" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.456921 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2" containerName="extract" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.457605 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-kb6l2" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.460579 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-d6b57" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.460606 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.460738 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.466356 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-kb6l2"] Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.542534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfx24\" (UniqueName: \"kubernetes.io/projected/ed1a865e-60b5-4988-aece-9badf4d94f37-kube-api-access-sfx24\") pod \"nmstate-operator-646758c888-kb6l2\" (UID: \"ed1a865e-60b5-4988-aece-9badf4d94f37\") " pod="openshift-nmstate/nmstate-operator-646758c888-kb6l2" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.644019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfx24\" (UniqueName: \"kubernetes.io/projected/ed1a865e-60b5-4988-aece-9badf4d94f37-kube-api-access-sfx24\") pod \"nmstate-operator-646758c888-kb6l2\" (UID: \"ed1a865e-60b5-4988-aece-9badf4d94f37\") " pod="openshift-nmstate/nmstate-operator-646758c888-kb6l2" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.673983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfx24\" (UniqueName: \"kubernetes.io/projected/ed1a865e-60b5-4988-aece-9badf4d94f37-kube-api-access-sfx24\") pod \"nmstate-operator-646758c888-kb6l2\" (UID: \"ed1a865e-60b5-4988-aece-9badf4d94f37\") " pod="openshift-nmstate/nmstate-operator-646758c888-kb6l2" Jan 28 18:50:47 crc kubenswrapper[4749]: I0128 18:50:47.775409 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-kb6l2" Jan 28 18:50:48 crc kubenswrapper[4749]: I0128 18:50:48.202891 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-kb6l2"] Jan 28 18:50:48 crc kubenswrapper[4749]: I0128 18:50:48.479442 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-kb6l2" event={"ID":"ed1a865e-60b5-4988-aece-9badf4d94f37","Type":"ContainerStarted","Data":"7cff4c7ba5d5344803f048020e8a50a95097c9d4b83d2852d63f17a288b5b940"} Jan 28 18:50:50 crc kubenswrapper[4749]: I0128 18:50:50.494646 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-kb6l2" event={"ID":"ed1a865e-60b5-4988-aece-9badf4d94f37","Type":"ContainerStarted","Data":"17f50132cee5483f2a8f7e293ec7a0ddb1af898db07d807ef62383de6c85d150"} Jan 28 18:50:50 crc kubenswrapper[4749]: I0128 18:50:50.513425 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-kb6l2" podStartSLOduration=1.462805372 podStartE2EDuration="3.513405135s" podCreationTimestamp="2026-01-28 18:50:47 +0000 UTC" firstStartedPulling="2026-01-28 18:50:48.211977336 +0000 UTC m=+916.223504111" lastFinishedPulling="2026-01-28 18:50:50.262577099 +0000 UTC m=+918.274103874" observedRunningTime="2026-01-28 18:50:50.509832179 +0000 UTC m=+918.521358974" watchObservedRunningTime="2026-01-28 18:50:50.513405135 +0000 UTC m=+918.524931900" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.170256 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-cd7b4"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.172250 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.174723 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-56n7f" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.183174 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-cd7b4"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.191864 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.193066 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.195198 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.200961 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.208920 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zfv4r"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.210150 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.327309 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj4gp\" (UniqueName: \"kubernetes.io/projected/89bc2633-4343-4bc9-9738-38c95e83dccd-kube-api-access-bj4gp\") pod \"nmstate-metrics-54757c584b-cd7b4\" (UID: \"89bc2633-4343-4bc9-9738-38c95e83dccd\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.327423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed9849a0-0d35-4788-8987-a08e79096f9b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lvpx5\" (UID: \"ed9849a0-0d35-4788-8987-a08e79096f9b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.327688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsfc\" (UniqueName: \"kubernetes.io/projected/ed9849a0-0d35-4788-8987-a08e79096f9b-kube-api-access-7jsfc\") pod \"nmstate-webhook-8474b5b9d8-lvpx5\" (UID: \"ed9849a0-0d35-4788-8987-a08e79096f9b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.327760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-nmstate-lock\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.327882 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-dbus-socket\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.327936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxsr8\" (UniqueName: \"kubernetes.io/projected/61da7503-b316-46f6-8ea1-2ee05d142a2f-kube-api-access-mxsr8\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.328232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-ovs-socket\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.349030 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.350450 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.354991 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jsm9h" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.355298 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.355683 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.364636 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.430440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-ovs-socket\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.430541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj4gp\" (UniqueName: \"kubernetes.io/projected/89bc2633-4343-4bc9-9738-38c95e83dccd-kube-api-access-bj4gp\") pod \"nmstate-metrics-54757c584b-cd7b4\" (UID: \"89bc2633-4343-4bc9-9738-38c95e83dccd\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.430580 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-ovs-socket\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.430623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed9849a0-0d35-4788-8987-a08e79096f9b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lvpx5\" (UID: \"ed9849a0-0d35-4788-8987-a08e79096f9b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.430696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsfc\" (UniqueName: \"kubernetes.io/projected/ed9849a0-0d35-4788-8987-a08e79096f9b-kube-api-access-7jsfc\") pod \"nmstate-webhook-8474b5b9d8-lvpx5\" (UID: \"ed9849a0-0d35-4788-8987-a08e79096f9b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.430722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-nmstate-lock\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.430777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-dbus-socket\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc 
kubenswrapper[4749]: I0128 18:50:52.430805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxsr8\" (UniqueName: \"kubernetes.io/projected/61da7503-b316-46f6-8ea1-2ee05d142a2f-kube-api-access-mxsr8\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.431038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-nmstate-lock\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.431734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61da7503-b316-46f6-8ea1-2ee05d142a2f-dbus-socket\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.437179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed9849a0-0d35-4788-8987-a08e79096f9b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lvpx5\" (UID: \"ed9849a0-0d35-4788-8987-a08e79096f9b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.453459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsfc\" (UniqueName: \"kubernetes.io/projected/ed9849a0-0d35-4788-8987-a08e79096f9b-kube-api-access-7jsfc\") pod \"nmstate-webhook-8474b5b9d8-lvpx5\" (UID: \"ed9849a0-0d35-4788-8987-a08e79096f9b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.453818 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxsr8\" (UniqueName: \"kubernetes.io/projected/61da7503-b316-46f6-8ea1-2ee05d142a2f-kube-api-access-mxsr8\") pod \"nmstate-handler-zfv4r\" (UID: \"61da7503-b316-46f6-8ea1-2ee05d142a2f\") " pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.454853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj4gp\" (UniqueName: \"kubernetes.io/projected/89bc2633-4343-4bc9-9738-38c95e83dccd-kube-api-access-bj4gp\") pod \"nmstate-metrics-54757c584b-cd7b4\" (UID: \"89bc2633-4343-4bc9-9738-38c95e83dccd\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.532226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c06f0b6-f78b-44cb-8305-c4ac809711e8-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.532386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhzw8\" (UniqueName: \"kubernetes.io/projected/5c06f0b6-f78b-44cb-8305-c4ac809711e8-kube-api-access-hhzw8\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " 
pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.532626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5c06f0b6-f78b-44cb-8305-c4ac809711e8-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.567637 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.583778 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-ff46b7d8d-w6bmg"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.585011 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.588062 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.600546 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ff46b7d8d-w6bmg"] Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.601054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.635465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5c06f0b6-f78b-44cb-8305-c4ac809711e8-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.635535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c06f0b6-f78b-44cb-8305-c4ac809711e8-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.635576 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhzw8\" (UniqueName: \"kubernetes.io/projected/5c06f0b6-f78b-44cb-8305-c4ac809711e8-kube-api-access-hhzw8\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: E0128 18:50:52.636257 4749 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Jan 28 18:50:52 crc kubenswrapper[4749]: E0128 18:50:52.636392 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c06f0b6-f78b-44cb-8305-c4ac809711e8-plugin-serving-cert podName:5c06f0b6-f78b-44cb-8305-c4ac809711e8 nodeName:}" failed. No retries permitted until 2026-01-28 18:50:53.13637221 +0000 UTC m=+921.147898985 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/5c06f0b6-f78b-44cb-8305-c4ac809711e8-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-mbmx8" (UID: "5c06f0b6-f78b-44cb-8305-c4ac809711e8") : secret "plugin-serving-cert" not found Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.636905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5c06f0b6-f78b-44cb-8305-c4ac809711e8-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.657903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhzw8\" (UniqueName: \"kubernetes.io/projected/5c06f0b6-f78b-44cb-8305-c4ac809711e8-kube-api-access-hhzw8\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.738195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-serving-cert\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.738250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-service-ca\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.738282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rh2\" (UniqueName: \"kubernetes.io/projected/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-kube-api-access-p6rh2\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.738304 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-oauth-config\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.738344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-config\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.738432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-trusted-ca-bundle\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " 
pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.738458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-oauth-serving-cert\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.840466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-service-ca\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.840845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rh2\" (UniqueName: \"kubernetes.io/projected/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-kube-api-access-p6rh2\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.840908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-oauth-config\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.840962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-config\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.841073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-trusted-ca-bundle\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.841096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-oauth-serving-cert\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.841161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-serving-cert\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.841539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-service-ca\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " 
pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.841782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-config\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.843098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-oauth-serving-cert\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.843797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-trusted-ca-bundle\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.847501 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-serving-cert\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.847762 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-oauth-config\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.864654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rh2\" (UniqueName: \"kubernetes.io/projected/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-kube-api-access-p6rh2\") pod \"console-ff46b7d8d-w6bmg\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.927115 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:50:52 crc kubenswrapper[4749]: I0128 18:50:52.991677 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-cd7b4"] Jan 28 18:50:52 crc kubenswrapper[4749]: W0128 18:50:52.991817 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89bc2633_4343_4bc9_9738_38c95e83dccd.slice/crio-8a5681c79e0ef221ad6d653c9947ad7f27156997b3820833b4638df1f1c054f1 WatchSource:0}: Error finding container 8a5681c79e0ef221ad6d653c9947ad7f27156997b3820833b4638df1f1c054f1: Status 404 returned error can't find the container with id 8a5681c79e0ef221ad6d653c9947ad7f27156997b3820833b4638df1f1c054f1 Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.148600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c06f0b6-f78b-44cb-8305-c4ac809711e8-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.152941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c06f0b6-f78b-44cb-8305-c4ac809711e8-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mbmx8\" (UID: \"5c06f0b6-f78b-44cb-8305-c4ac809711e8\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.277116 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.317688 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5"] Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.507798 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8"] Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.520677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zfv4r" event={"ID":"61da7503-b316-46f6-8ea1-2ee05d142a2f","Type":"ContainerStarted","Data":"1d52fdaec87efb9b2f0f9e0e82a348e8f3393fc4edfe66d945b3f2f90023da1c"} Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.521982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" event={"ID":"89bc2633-4343-4bc9-9738-38c95e83dccd","Type":"ContainerStarted","Data":"8a5681c79e0ef221ad6d653c9947ad7f27156997b3820833b4638df1f1c054f1"} Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.523115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" event={"ID":"ed9849a0-0d35-4788-8987-a08e79096f9b","Type":"ContainerStarted","Data":"1eb566ff5e16db80630ae64e14bd8cb55f855f1f4e8f447e7d0cd49440c29b5a"} Jan 28 18:50:53 crc kubenswrapper[4749]: I0128 18:50:53.544421 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ff46b7d8d-w6bmg"] Jan 28 18:50:53 crc kubenswrapper[4749]: W0128 18:50:53.554390 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e59c97f_819b_46d8_98a2_7a491b0b7c9e.slice/crio-58829424ca0b0a1ecfb9cf838ba3b4c868e12368852dcfdc7d7bbdd302d4b192 WatchSource:0}: Error finding container 58829424ca0b0a1ecfb9cf838ba3b4c868e12368852dcfdc7d7bbdd302d4b192: Status 404 returned error can't find the container with id 58829424ca0b0a1ecfb9cf838ba3b4c868e12368852dcfdc7d7bbdd302d4b192 Jan 28 18:50:54 crc kubenswrapper[4749]: I0128 18:50:54.544803 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff46b7d8d-w6bmg" event={"ID":"6e59c97f-819b-46d8-98a2-7a491b0b7c9e","Type":"ContainerStarted","Data":"5b0bbb45ed106557a40b2ef270f22245db586d4486855e17b5f77aeef8aa1046"} Jan 28 18:50:54 crc kubenswrapper[4749]: I0128 18:50:54.545193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff46b7d8d-w6bmg" event={"ID":"6e59c97f-819b-46d8-98a2-7a491b0b7c9e","Type":"ContainerStarted","Data":"58829424ca0b0a1ecfb9cf838ba3b4c868e12368852dcfdc7d7bbdd302d4b192"} Jan 28 18:50:54 crc kubenswrapper[4749]: I0128 18:50:54.548007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" event={"ID":"5c06f0b6-f78b-44cb-8305-c4ac809711e8","Type":"ContainerStarted","Data":"a1be33158e2f84e21ef0a5e23410ac2833b83ecebc71f8e9ebcb9166fb0ff032"} Jan 28 18:50:54 crc kubenswrapper[4749]: I0128 18:50:54.571264 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-ff46b7d8d-w6bmg" podStartSLOduration=2.5712439590000002 podStartE2EDuration="2.571243959s" podCreationTimestamp="2026-01-28 18:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:50:54.565131881 +0000 UTC m=+922.576658676" watchObservedRunningTime="2026-01-28 18:50:54.571243959 +0000 UTC m=+922.582770734" Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.468658 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.469493 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.574705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" event={"ID":"ed9849a0-0d35-4788-8987-a08e79096f9b","Type":"ContainerStarted","Data":"7621876677ffa0d28db211f7ea131c6d817300e3097465f4db22251d2e17b893"} Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.574993 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.577776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zfv4r" event={"ID":"61da7503-b316-46f6-8ea1-2ee05d142a2f","Type":"ContainerStarted","Data":"d739903771154dc4defbfa7b89933140deae4e9ef9a399cb7805d051ce510c6b"} Jan 28 18:50:57 
crc kubenswrapper[4749]: I0128 18:50:57.578104 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.579381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" event={"ID":"89bc2633-4343-4bc9-9738-38c95e83dccd","Type":"ContainerStarted","Data":"085effdfa2711c4ea7be20ca252b944316468bd0a00f1950a39062707cd3271e"} Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.580715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" event={"ID":"5c06f0b6-f78b-44cb-8305-c4ac809711e8","Type":"ContainerStarted","Data":"7a65d9b8eb42a2c572c55d9ae1f282f8e9467bd7b3ab7f156e660926cf3e760b"} Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.601719 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" podStartSLOduration=2.427732194 podStartE2EDuration="5.601701808s" podCreationTimestamp="2026-01-28 18:50:52 +0000 UTC" firstStartedPulling="2026-01-28 18:50:53.34188242 +0000 UTC m=+921.353409195" lastFinishedPulling="2026-01-28 18:50:56.515852034 +0000 UTC m=+924.527378809" observedRunningTime="2026-01-28 18:50:57.589518202 +0000 UTC m=+925.601044997" watchObservedRunningTime="2026-01-28 18:50:57.601701808 +0000 UTC m=+925.613228583" Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.621125 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zfv4r" podStartSLOduration=1.821743654 podStartE2EDuration="5.621023676s" podCreationTimestamp="2026-01-28 18:50:52 +0000 UTC" firstStartedPulling="2026-01-28 18:50:52.708660321 +0000 UTC m=+920.720187096" lastFinishedPulling="2026-01-28 18:50:56.507940343 +0000 UTC m=+924.519467118" observedRunningTime="2026-01-28 18:50:57.612808917 +0000 UTC m=+925.624335692" watchObservedRunningTime="2026-01-28 18:50:57.621023676 +0000 UTC m=+925.632550451" Jan 28 18:50:57 crc kubenswrapper[4749]: I0128 18:50:57.635963 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mbmx8" podStartSLOduration=2.645034687 podStartE2EDuration="5.635926426s" podCreationTimestamp="2026-01-28 18:50:52 +0000 UTC" firstStartedPulling="2026-01-28 18:50:53.515509146 +0000 UTC m=+921.527035921" lastFinishedPulling="2026-01-28 18:50:56.506400885 +0000 UTC m=+924.517927660" observedRunningTime="2026-01-28 18:50:57.626856217 +0000 UTC m=+925.638383002" watchObservedRunningTime="2026-01-28 18:50:57.635926426 +0000 UTC m=+925.647453191" Jan 28 18:50:59 crc kubenswrapper[4749]: I0128 18:50:59.596512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" event={"ID":"89bc2633-4343-4bc9-9738-38c95e83dccd","Type":"ContainerStarted","Data":"9f36aff0a42f39cbf301e99f52290087d3ca1284931ca527c6d3915aa8d64aa5"} Jan 28 18:50:59 crc kubenswrapper[4749]: I0128 18:50:59.619053 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-cd7b4" podStartSLOduration=1.239201052 podStartE2EDuration="7.619025404s" podCreationTimestamp="2026-01-28 18:50:52 +0000 UTC" firstStartedPulling="2026-01-28 18:50:53.004118758 +0000 UTC m=+921.015645543" lastFinishedPulling="2026-01-28 18:50:59.38394312 +0000 UTC m=+927.395469895" observedRunningTime="2026-01-28 
18:50:59.612562267 +0000 UTC m=+927.624089052" watchObservedRunningTime="2026-01-28 18:50:59.619025404 +0000 UTC m=+927.630552179" Jan 28 18:51:02 crc kubenswrapper[4749]: I0128 18:51:02.626043 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zfv4r" Jan 28 18:51:02 crc kubenswrapper[4749]: I0128 18:51:02.927291 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:51:02 crc kubenswrapper[4749]: I0128 18:51:02.927393 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:51:02 crc kubenswrapper[4749]: I0128 18:51:02.932558 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:51:03 crc kubenswrapper[4749]: I0128 18:51:03.622722 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:51:03 crc kubenswrapper[4749]: I0128 18:51:03.685876 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54b4fd76b5-v7gv6"] Jan 28 18:51:12 crc kubenswrapper[4749]: I0128 18:51:12.594471 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lvpx5" Jan 28 18:51:27 crc kubenswrapper[4749]: I0128 18:51:27.467716 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:51:27 crc kubenswrapper[4749]: I0128 18:51:27.468619 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:51:27 crc kubenswrapper[4749]: I0128 18:51:27.468667 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:51:27 crc kubenswrapper[4749]: I0128 18:51:27.469981 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e5b313040fcc5d2f4a0e0713dba32c28c08f86a27e8cecbbc5d364a34a7eb3e"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 18:51:27 crc kubenswrapper[4749]: I0128 18:51:27.470041 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://4e5b313040fcc5d2f4a0e0713dba32c28c08f86a27e8cecbbc5d364a34a7eb3e" gracePeriod=600 Jan 28 18:51:27 crc kubenswrapper[4749]: I0128 18:51:27.812157 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="4e5b313040fcc5d2f4a0e0713dba32c28c08f86a27e8cecbbc5d364a34a7eb3e" exitCode=0 Jan 28 18:51:27 crc kubenswrapper[4749]: I0128 18:51:27.812249 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"4e5b313040fcc5d2f4a0e0713dba32c28c08f86a27e8cecbbc5d364a34a7eb3e"} Jan 28 18:51:27 crc kubenswrapper[4749]: I0128 18:51:27.812520 4749 scope.go:117] "RemoveContainer" containerID="04f2d6bfc0ed8ee6f0293f2bd9184234285368c2476cf72bb7d6b248a665883d" Jan 28 18:51:28 crc kubenswrapper[4749]: I0128 18:51:28.730304 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-54b4fd76b5-v7gv6" podUID="ec672680-fd53-4817-84ef-86edc8d76b8d" containerName="console" containerID="cri-o://a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b" gracePeriod=15 Jan 28 18:51:28 crc kubenswrapper[4749]: I0128 18:51:28.821956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"ffd739e178035a0a80263ddfa883436d23618669a6ffd6b8554f99da5a12189b"} Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.184070 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54b4fd76b5-v7gv6_ec672680-fd53-4817-84ef-86edc8d76b8d/console/0.log" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.184441 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.305779 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-oauth-config\") pod \"ec672680-fd53-4817-84ef-86edc8d76b8d\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.306157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-service-ca\") pod \"ec672680-fd53-4817-84ef-86edc8d76b8d\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.306308 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-trusted-ca-bundle\") pod \"ec672680-fd53-4817-84ef-86edc8d76b8d\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.307428 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-service-ca" (OuterVolumeSpecName: "service-ca") pod "ec672680-fd53-4817-84ef-86edc8d76b8d" (UID: "ec672680-fd53-4817-84ef-86edc8d76b8d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.307443 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ec672680-fd53-4817-84ef-86edc8d76b8d" (UID: "ec672680-fd53-4817-84ef-86edc8d76b8d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.307492 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-console-config\") pod \"ec672680-fd53-4817-84ef-86edc8d76b8d\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.307575 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-oauth-serving-cert\") pod \"ec672680-fd53-4817-84ef-86edc8d76b8d\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.307665 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-serving-cert\") pod \"ec672680-fd53-4817-84ef-86edc8d76b8d\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.307724 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgbn8\" (UniqueName: \"kubernetes.io/projected/ec672680-fd53-4817-84ef-86edc8d76b8d-kube-api-access-sgbn8\") pod \"ec672680-fd53-4817-84ef-86edc8d76b8d\" (UID: \"ec672680-fd53-4817-84ef-86edc8d76b8d\") " Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.307795 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-console-config" (OuterVolumeSpecName: "console-config") pod "ec672680-fd53-4817-84ef-86edc8d76b8d" (UID: "ec672680-fd53-4817-84ef-86edc8d76b8d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.308070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ec672680-fd53-4817-84ef-86edc8d76b8d" (UID: "ec672680-fd53-4817-84ef-86edc8d76b8d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.308433 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.308455 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.308472 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.308483 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec672680-fd53-4817-84ef-86edc8d76b8d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.317660 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ec672680-fd53-4817-84ef-86edc8d76b8d" (UID: "ec672680-fd53-4817-84ef-86edc8d76b8d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.317845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec672680-fd53-4817-84ef-86edc8d76b8d-kube-api-access-sgbn8" (OuterVolumeSpecName: "kube-api-access-sgbn8") pod "ec672680-fd53-4817-84ef-86edc8d76b8d" (UID: "ec672680-fd53-4817-84ef-86edc8d76b8d"). InnerVolumeSpecName "kube-api-access-sgbn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.317905 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ec672680-fd53-4817-84ef-86edc8d76b8d" (UID: "ec672680-fd53-4817-84ef-86edc8d76b8d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.392995 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm"] Jan 28 18:51:29 crc kubenswrapper[4749]: E0128 18:51:29.393413 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec672680-fd53-4817-84ef-86edc8d76b8d" containerName="console" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.393433 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec672680-fd53-4817-84ef-86edc8d76b8d" containerName="console" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.393602 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec672680-fd53-4817-84ef-86edc8d76b8d" containerName="console" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.394953 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.398118 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.403766 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm"] Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.419796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg69r\" (UniqueName: \"kubernetes.io/projected/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-kube-api-access-mg69r\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.419872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.419932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.420056 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.420071 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec672680-fd53-4817-84ef-86edc8d76b8d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.420083 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgbn8\" (UniqueName: \"kubernetes.io/projected/ec672680-fd53-4817-84ef-86edc8d76b8d-kube-api-access-sgbn8\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.521501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg69r\" (UniqueName: \"kubernetes.io/projected/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-kube-api-access-mg69r\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.521580 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-util\") pod 
\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.521644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.522191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.522701 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.563495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg69r\" (UniqueName: \"kubernetes.io/projected/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-kube-api-access-mg69r\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.718288 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.830787 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54b4fd76b5-v7gv6_ec672680-fd53-4817-84ef-86edc8d76b8d/console/0.log" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.831091 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec672680-fd53-4817-84ef-86edc8d76b8d" containerID="a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b" exitCode=2 Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.831476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b4fd76b5-v7gv6" event={"ID":"ec672680-fd53-4817-84ef-86edc8d76b8d","Type":"ContainerDied","Data":"a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b"} Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.831542 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54b4fd76b5-v7gv6" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.831577 4749 scope.go:117] "RemoveContainer" containerID="a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.831561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b4fd76b5-v7gv6" event={"ID":"ec672680-fd53-4817-84ef-86edc8d76b8d","Type":"ContainerDied","Data":"1f364c398eb6c0daa74f7ad9a03061c65e27e2faaac7e358319fbc3bfd759c42"} Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.911959 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54b4fd76b5-v7gv6"] Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.919696 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54b4fd76b5-v7gv6"] Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.924425 4749 scope.go:117] "RemoveContainer" containerID="a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b" Jan 28 18:51:29 crc kubenswrapper[4749]: E0128 18:51:29.926693 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b\": container with ID starting with a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b not found: ID does not exist" containerID="a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.926753 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b"} err="failed to get container status \"a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b\": rpc error: code = NotFound desc = could not find container \"a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b\": container with ID starting with a5cfc00932510b59ea5ac8739a75ffd7477c5c574ce954c4c76a875a861c837b not found: ID does not exist" Jan 28 18:51:29 crc kubenswrapper[4749]: I0128 18:51:29.938230 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm"] Jan 28 18:51:29 crc kubenswrapper[4749]: W0128 18:51:29.945446 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9df443ae_e4fd_4f9c_ba5f_99668ef3e451.slice/crio-f58e095b04cba7bbca0333703b210fbb1c080870f4b7ac7565fc6b52d65c21a6 WatchSource:0}: Error finding container f58e095b04cba7bbca0333703b210fbb1c080870f4b7ac7565fc6b52d65c21a6: Status 404 returned error can't find the container with id f58e095b04cba7bbca0333703b210fbb1c080870f4b7ac7565fc6b52d65c21a6 Jan 28 18:51:30 crc kubenswrapper[4749]: I0128 18:51:30.839092 4749 generic.go:334] "Generic (PLEG): container finished" podID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerID="cc78b2caa66755624ff9890a416124a678e9835bb3a092701d4a9324e38e921a" exitCode=0 Jan 28 18:51:30 crc kubenswrapper[4749]: I0128 18:51:30.839208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" event={"ID":"9df443ae-e4fd-4f9c-ba5f-99668ef3e451","Type":"ContainerDied","Data":"cc78b2caa66755624ff9890a416124a678e9835bb3a092701d4a9324e38e921a"} Jan 28 18:51:30 crc kubenswrapper[4749]: I0128 
18:51:30.839261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" event={"ID":"9df443ae-e4fd-4f9c-ba5f-99668ef3e451","Type":"ContainerStarted","Data":"f58e095b04cba7bbca0333703b210fbb1c080870f4b7ac7565fc6b52d65c21a6"} Jan 28 18:51:30 crc kubenswrapper[4749]: I0128 18:51:30.840929 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 18:51:30 crc kubenswrapper[4749]: I0128 18:51:30.880578 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec672680-fd53-4817-84ef-86edc8d76b8d" path="/var/lib/kubelet/pods/ec672680-fd53-4817-84ef-86edc8d76b8d/volumes" Jan 28 18:51:32 crc kubenswrapper[4749]: I0128 18:51:32.855403 4749 generic.go:334] "Generic (PLEG): container finished" podID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerID="afd191135ab1f82e4c9eb193db78ff677a0084f684d01c61e5779e20b2c6bab1" exitCode=0 Jan 28 18:51:32 crc kubenswrapper[4749]: I0128 18:51:32.855448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" event={"ID":"9df443ae-e4fd-4f9c-ba5f-99668ef3e451","Type":"ContainerDied","Data":"afd191135ab1f82e4c9eb193db78ff677a0084f684d01c61e5779e20b2c6bab1"} Jan 28 18:51:33 crc kubenswrapper[4749]: I0128 18:51:33.867562 4749 generic.go:334] "Generic (PLEG): container finished" podID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerID="3d811c4a7fefcc4e3ea62b6df152527da28a9e22c15053609f025f8d5a3ff92c" exitCode=0 Jan 28 18:51:33 crc kubenswrapper[4749]: I0128 18:51:33.867689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" event={"ID":"9df443ae-e4fd-4f9c-ba5f-99668ef3e451","Type":"ContainerDied","Data":"3d811c4a7fefcc4e3ea62b6df152527da28a9e22c15053609f025f8d5a3ff92c"} Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.157691 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.244851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg69r\" (UniqueName: \"kubernetes.io/projected/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-kube-api-access-mg69r\") pod \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.244909 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-bundle\") pod \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.245006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-util\") pod \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\" (UID: \"9df443ae-e4fd-4f9c-ba5f-99668ef3e451\") " Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.246044 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-bundle" (OuterVolumeSpecName: "bundle") pod "9df443ae-e4fd-4f9c-ba5f-99668ef3e451" (UID: "9df443ae-e4fd-4f9c-ba5f-99668ef3e451"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.251267 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-kube-api-access-mg69r" (OuterVolumeSpecName: "kube-api-access-mg69r") pod "9df443ae-e4fd-4f9c-ba5f-99668ef3e451" (UID: "9df443ae-e4fd-4f9c-ba5f-99668ef3e451"). InnerVolumeSpecName "kube-api-access-mg69r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.347741 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg69r\" (UniqueName: \"kubernetes.io/projected/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-kube-api-access-mg69r\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.348115 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.704183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-util" (OuterVolumeSpecName: "util") pod "9df443ae-e4fd-4f9c-ba5f-99668ef3e451" (UID: "9df443ae-e4fd-4f9c-ba5f-99668ef3e451"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.752370 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9df443ae-e4fd-4f9c-ba5f-99668ef3e451-util\") on node \"crc\" DevicePath \"\"" Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.882090 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" event={"ID":"9df443ae-e4fd-4f9c-ba5f-99668ef3e451","Type":"ContainerDied","Data":"f58e095b04cba7bbca0333703b210fbb1c080870f4b7ac7565fc6b52d65c21a6"} Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.882129 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f58e095b04cba7bbca0333703b210fbb1c080870f4b7ac7565fc6b52d65c21a6" Jan 28 18:51:35 crc kubenswrapper[4749]: I0128 18:51:35.882179 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.308758 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-75b778c54b-888q2"] Jan 28 18:51:44 crc kubenswrapper[4749]: E0128 18:51:44.309617 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerName="pull" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.309633 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerName="pull" Jan 28 18:51:44 crc kubenswrapper[4749]: E0128 18:51:44.309655 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerName="extract" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.309663 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerName="extract" Jan 28 18:51:44 crc kubenswrapper[4749]: E0128 18:51:44.309677 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerName="util" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.309686 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerName="util" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.309851 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df443ae-e4fd-4f9c-ba5f-99668ef3e451" containerName="extract" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.310550 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.313926 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.314018 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.314236 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ft7mv" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.319835 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.319835 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.341994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75b778c54b-888q2"] Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.506799 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2368696d-a2cf-4a26-a7be-43981bf04f66-apiservice-cert\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.506869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5594\" (UniqueName: \"kubernetes.io/projected/2368696d-a2cf-4a26-a7be-43981bf04f66-kube-api-access-b5594\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.507067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2368696d-a2cf-4a26-a7be-43981bf04f66-webhook-cert\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.608428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5594\" (UniqueName: \"kubernetes.io/projected/2368696d-a2cf-4a26-a7be-43981bf04f66-kube-api-access-b5594\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.608511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2368696d-a2cf-4a26-a7be-43981bf04f66-webhook-cert\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.608622 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2368696d-a2cf-4a26-a7be-43981bf04f66-apiservice-cert\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.615133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2368696d-a2cf-4a26-a7be-43981bf04f66-webhook-cert\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.627518 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2368696d-a2cf-4a26-a7be-43981bf04f66-apiservice-cert\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.637169 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc"] Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.638423 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.640395 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.640567 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-j756n" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.641127 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.664071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5594\" (UniqueName: \"kubernetes.io/projected/2368696d-a2cf-4a26-a7be-43981bf04f66-kube-api-access-b5594\") pod \"metallb-operator-controller-manager-75b778c54b-888q2\" (UID: \"2368696d-a2cf-4a26-a7be-43981bf04f66\") " pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.675545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc"] Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.812399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b4b0e51-a33b-4432-98be-be91474370c0-apiservice-cert\") pod \"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.812979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fdg4\" (UniqueName: \"kubernetes.io/projected/5b4b0e51-a33b-4432-98be-be91474370c0-kube-api-access-7fdg4\") pod 
\"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.813055 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b4b0e51-a33b-4432-98be-be91474370c0-webhook-cert\") pod \"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.914074 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b4b0e51-a33b-4432-98be-be91474370c0-apiservice-cert\") pod \"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.914155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fdg4\" (UniqueName: \"kubernetes.io/projected/5b4b0e51-a33b-4432-98be-be91474370c0-kube-api-access-7fdg4\") pod \"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.914231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b4b0e51-a33b-4432-98be-be91474370c0-webhook-cert\") pod \"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.920925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b4b0e51-a33b-4432-98be-be91474370c0-apiservice-cert\") pod \"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.924181 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b4b0e51-a33b-4432-98be-be91474370c0-webhook-cert\") pod \"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.937140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:44 crc kubenswrapper[4749]: I0128 18:51:44.939022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fdg4\" (UniqueName: \"kubernetes.io/projected/5b4b0e51-a33b-4432-98be-be91474370c0-kube-api-access-7fdg4\") pod \"metallb-operator-webhook-server-79bbd885c9-dtmgc\" (UID: \"5b4b0e51-a33b-4432-98be-be91474370c0\") " pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:45 crc kubenswrapper[4749]: I0128 18:51:45.027844 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:45 crc kubenswrapper[4749]: I0128 18:51:45.440828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75b778c54b-888q2"] Jan 28 18:51:45 crc kubenswrapper[4749]: I0128 18:51:45.649214 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc"] Jan 28 18:51:45 crc kubenswrapper[4749]: W0128 18:51:45.658611 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b4b0e51_a33b_4432_98be_be91474370c0.slice/crio-417c6523bf988f43373fdf48e852d1abda2e5d8c127050bdf3d743b562608669 WatchSource:0}: Error finding container 417c6523bf988f43373fdf48e852d1abda2e5d8c127050bdf3d743b562608669: Status 404 returned error can't find the container with id 417c6523bf988f43373fdf48e852d1abda2e5d8c127050bdf3d743b562608669 Jan 28 18:51:45 crc kubenswrapper[4749]: I0128 18:51:45.954992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" event={"ID":"2368696d-a2cf-4a26-a7be-43981bf04f66","Type":"ContainerStarted","Data":"f2eafd9f0e5ab6c1804f859ac4e839830f8dddb2aaa536bf376cc01c9544850f"} Jan 28 18:51:45 crc kubenswrapper[4749]: I0128 18:51:45.957380 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" event={"ID":"5b4b0e51-a33b-4432-98be-be91474370c0","Type":"ContainerStarted","Data":"417c6523bf988f43373fdf48e852d1abda2e5d8c127050bdf3d743b562608669"} Jan 28 18:51:53 crc kubenswrapper[4749]: I0128 18:51:53.026190 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" event={"ID":"2368696d-a2cf-4a26-a7be-43981bf04f66","Type":"ContainerStarted","Data":"f40d216fb15175ef8ca1301a3ae27306b48195a413a60b68e3ff4a4f43c987df"} Jan 28 18:51:53 crc kubenswrapper[4749]: I0128 18:51:53.026940 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:51:53 crc kubenswrapper[4749]: I0128 18:51:53.028768 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" event={"ID":"5b4b0e51-a33b-4432-98be-be91474370c0","Type":"ContainerStarted","Data":"478a651e8c98c05c79812c5ee8858055f69d3454318b5b782eacfa03cac0b8d5"} Jan 28 18:51:53 crc kubenswrapper[4749]: I0128 18:51:53.029557 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:51:53 crc kubenswrapper[4749]: I0128 18:51:53.048446 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" podStartSLOduration=2.284020322 podStartE2EDuration="9.048429651s" podCreationTimestamp="2026-01-28 18:51:44 +0000 UTC" firstStartedPulling="2026-01-28 18:51:45.449491987 +0000 UTC m=+973.461018762" lastFinishedPulling="2026-01-28 18:51:52.213901316 +0000 UTC m=+980.225428091" observedRunningTime="2026-01-28 18:51:53.042755453 +0000 UTC m=+981.054282228" watchObservedRunningTime="2026-01-28 18:51:53.048429651 +0000 UTC m=+981.059956416" Jan 28 18:51:53 crc kubenswrapper[4749]: I0128 18:51:53.091377 4749 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" podStartSLOduration=2.524520927 podStartE2EDuration="9.0913506s" podCreationTimestamp="2026-01-28 18:51:44 +0000 UTC" firstStartedPulling="2026-01-28 18:51:45.665280984 +0000 UTC m=+973.676807759" lastFinishedPulling="2026-01-28 18:51:52.232110657 +0000 UTC m=+980.243637432" observedRunningTime="2026-01-28 18:51:53.067868781 +0000 UTC m=+981.079395556" watchObservedRunningTime="2026-01-28 18:51:53.0913506 +0000 UTC m=+981.102877365" Jan 28 18:52:05 crc kubenswrapper[4749]: I0128 18:52:05.035425 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79bbd885c9-dtmgc" Jan 28 18:52:24 crc kubenswrapper[4749]: I0128 18:52:24.943937 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-75b778c54b-888q2" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.637684 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx"] Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.639204 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.646832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.646897 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-v4mqf" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.647099 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5xs5q"] Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.650252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8c5ee0f-766a-4f21-9243-1e926fd8ebb0-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-q6rrx\" (UID: \"e8c5ee0f-766a-4f21-9243-1e926fd8ebb0\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.650584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvp9\" (UniqueName: \"kubernetes.io/projected/e8c5ee0f-766a-4f21-9243-1e926fd8ebb0-kube-api-access-hfvp9\") pod \"frr-k8s-webhook-server-7df86c4f6c-q6rrx\" (UID: \"e8c5ee0f-766a-4f21-9243-1e926fd8ebb0\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.650351 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.651978 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.653627 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.656910 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx"] Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvp9\" (UniqueName: \"kubernetes.io/projected/e8c5ee0f-766a-4f21-9243-1e926fd8ebb0-kube-api-access-hfvp9\") pod \"frr-k8s-webhook-server-7df86c4f6c-q6rrx\" (UID: \"e8c5ee0f-766a-4f21-9243-1e926fd8ebb0\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752475 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-startup\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752523 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics-certs\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn6b5\" (UniqueName: \"kubernetes.io/projected/d597ae59-efbc-48b1-9f21-51ead30e9812-kube-api-access-gn6b5\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8c5ee0f-766a-4f21-9243-1e926fd8ebb0-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-q6rrx\" (UID: \"e8c5ee0f-766a-4f21-9243-1e926fd8ebb0\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-reloader\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-sockets\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-conf\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.752759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.788497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvp9\" (UniqueName: \"kubernetes.io/projected/e8c5ee0f-766a-4f21-9243-1e926fd8ebb0-kube-api-access-hfvp9\") pod \"frr-k8s-webhook-server-7df86c4f6c-q6rrx\" (UID: \"e8c5ee0f-766a-4f21-9243-1e926fd8ebb0\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.826393 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8szhl"] Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.827674 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.832536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8c5ee0f-766a-4f21-9243-1e926fd8ebb0-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-q6rrx\" (UID: \"e8c5ee0f-766a-4f21-9243-1e926fd8ebb0\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.842417 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-wd4x6"] Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.844980 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-wd4x6"] Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.845117 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.890062 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.890803 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9rq65" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.890959 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.891100 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.891234 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-metrics-certs\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metrics-certs\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-sockets\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-conf\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics-certs\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 
18:52:25.892582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-startup\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6b5\" (UniqueName: \"kubernetes.io/projected/d597ae59-efbc-48b1-9f21-51ead30e9812-kube-api-access-gn6b5\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n82wp\" (UniqueName: \"kubernetes.io/projected/f3e87fb6-163a-4024-b3c6-f20d528fd58f-kube-api-access-n82wp\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s88zd\" (UniqueName: \"kubernetes.io/projected/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-kube-api-access-s88zd\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.892673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-cert\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:25 crc kubenswrapper[4749]: E0128 18:52:25.893412 4749 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 28 18:52:25 crc kubenswrapper[4749]: E0128 18:52:25.893462 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics-certs podName:d597ae59-efbc-48b1-9f21-51ead30e9812 nodeName:}" failed. No retries permitted until 2026-01-28 18:52:26.393448039 +0000 UTC m=+1014.404974814 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics-certs") pod "frr-k8s-5xs5q" (UID: "d597ae59-efbc-48b1-9f21-51ead30e9812") : secret "frr-k8s-certs-secret" not found Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.893566 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.893699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-sockets\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.895212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-startup\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.895428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-frr-conf\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.895903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-reloader\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.896120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d597ae59-efbc-48b1-9f21-51ead30e9812-reloader\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.896169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metallb-excludel2\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.938998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6b5\" (UniqueName: \"kubernetes.io/projected/d597ae59-efbc-48b1-9f21-51ead30e9812-kube-api-access-gn6b5\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.965015 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.997725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n82wp\" (UniqueName: \"kubernetes.io/projected/f3e87fb6-163a-4024-b3c6-f20d528fd58f-kube-api-access-n82wp\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.997771 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s88zd\" (UniqueName: \"kubernetes.io/projected/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-kube-api-access-s88zd\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.997811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-cert\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.997870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metallb-excludel2\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.997933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-metrics-certs\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.997950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metrics-certs\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.997973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: E0128 18:52:25.998873 4749 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 28 18:52:25 crc kubenswrapper[4749]: I0128 18:52:25.998880 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metallb-excludel2\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:25 crc kubenswrapper[4749]: E0128 18:52:25.998918 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-metrics-certs podName:4623a06f-773f-4e09-aaf7-8dcbc11cadd4 nodeName:}" failed. 
No retries permitted until 2026-01-28 18:52:26.498905433 +0000 UTC m=+1014.510432208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-metrics-certs") pod "controller-6968d8fdc4-wd4x6" (UID: "4623a06f-773f-4e09-aaf7-8dcbc11cadd4") : secret "controller-certs-secret" not found Jan 28 18:52:25 crc kubenswrapper[4749]: E0128 18:52:25.999125 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 28 18:52:25 crc kubenswrapper[4749]: E0128 18:52:25.999154 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist podName:f3e87fb6-163a-4024-b3c6-f20d528fd58f nodeName:}" failed. No retries permitted until 2026-01-28 18:52:26.499145999 +0000 UTC m=+1014.510672774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist") pod "speaker-8szhl" (UID: "f3e87fb6-163a-4024-b3c6-f20d528fd58f") : secret "metallb-memberlist" not found Jan 28 18:52:25 crc kubenswrapper[4749]: E0128 18:52:25.999189 4749 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 28 18:52:25 crc kubenswrapper[4749]: E0128 18:52:25.999207 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metrics-certs podName:f3e87fb6-163a-4024-b3c6-f20d528fd58f nodeName:}" failed. No retries permitted until 2026-01-28 18:52:26.49920142 +0000 UTC m=+1014.510728195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metrics-certs") pod "speaker-8szhl" (UID: "f3e87fb6-163a-4024-b3c6-f20d528fd58f") : secret "speaker-certs-secret" not found Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.008371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-cert\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.021386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s88zd\" (UniqueName: \"kubernetes.io/projected/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-kube-api-access-s88zd\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.023911 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n82wp\" (UniqueName: \"kubernetes.io/projected/f3e87fb6-163a-4024-b3c6-f20d528fd58f-kube-api-access-n82wp\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.405158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics-certs\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.410636 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d597ae59-efbc-48b1-9f21-51ead30e9812-metrics-certs\") pod \"frr-k8s-5xs5q\" (UID: \"d597ae59-efbc-48b1-9f21-51ead30e9812\") " pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.444782 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx"] Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.507034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-metrics-certs\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.507089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metrics-certs\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.507108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:26 crc kubenswrapper[4749]: E0128 18:52:26.507247 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 28 18:52:26 crc kubenswrapper[4749]: E0128 18:52:26.507304 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist podName:f3e87fb6-163a-4024-b3c6-f20d528fd58f nodeName:}" failed. No retries permitted until 2026-01-28 18:52:27.507288548 +0000 UTC m=+1015.518815323 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist") pod "speaker-8szhl" (UID: "f3e87fb6-163a-4024-b3c6-f20d528fd58f") : secret "metallb-memberlist" not found Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.510937 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4623a06f-773f-4e09-aaf7-8dcbc11cadd4-metrics-certs\") pod \"controller-6968d8fdc4-wd4x6\" (UID: \"4623a06f-773f-4e09-aaf7-8dcbc11cadd4\") " pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.512160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-metrics-certs\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.542229 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.577462 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:26 crc kubenswrapper[4749]: I0128 18:52:26.982291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-wd4x6"] Jan 28 18:52:27 crc kubenswrapper[4749]: I0128 18:52:27.274899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wd4x6" event={"ID":"4623a06f-773f-4e09-aaf7-8dcbc11cadd4","Type":"ContainerStarted","Data":"9dac00d0b08e3d03d91e2181ed5758fef97b1bede0c0e0759c6018c84af516de"} Jan 28 18:52:27 crc kubenswrapper[4749]: I0128 18:52:27.274945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wd4x6" event={"ID":"4623a06f-773f-4e09-aaf7-8dcbc11cadd4","Type":"ContainerStarted","Data":"9012cd69394486e853dae25dc2e5d2b2c9d933ab3efb7ee147820b905bd78b3c"} Jan 28 18:52:27 crc kubenswrapper[4749]: I0128 18:52:27.276347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerStarted","Data":"9247c5704bb6cf663f63561b176e20ec4c437ef3454c9e11d7703f5978bae5d7"} Jan 28 18:52:27 crc kubenswrapper[4749]: I0128 18:52:27.277393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" event={"ID":"e8c5ee0f-766a-4f21-9243-1e926fd8ebb0","Type":"ContainerStarted","Data":"3f5f4134a19d59227934108b3f9acce570bc90fb0d630358874dfd70dd91dc64"} Jan 28 18:52:27 crc kubenswrapper[4749]: I0128 18:52:27.521185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:27 crc kubenswrapper[4749]: I0128 18:52:27.527812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f3e87fb6-163a-4024-b3c6-f20d528fd58f-memberlist\") pod \"speaker-8szhl\" (UID: \"f3e87fb6-163a-4024-b3c6-f20d528fd58f\") " pod="metallb-system/speaker-8szhl" Jan 28 18:52:27 crc kubenswrapper[4749]: I0128 18:52:27.717558 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8szhl" Jan 28 18:52:27 crc kubenswrapper[4749]: W0128 18:52:27.752619 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3e87fb6_163a_4024_b3c6_f20d528fd58f.slice/crio-43a4b88be29e04829e769bda47d35892fea724971cd53b13c9a742ab67100618 WatchSource:0}: Error finding container 43a4b88be29e04829e769bda47d35892fea724971cd53b13c9a742ab67100618: Status 404 returned error can't find the container with id 43a4b88be29e04829e769bda47d35892fea724971cd53b13c9a742ab67100618 Jan 28 18:52:28 crc kubenswrapper[4749]: I0128 18:52:28.324273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wd4x6" event={"ID":"4623a06f-773f-4e09-aaf7-8dcbc11cadd4","Type":"ContainerStarted","Data":"40e86af8d7f02685235810e8dc1b8c4b1d331bc57a8d4f2154925c630400924d"} Jan 28 18:52:28 crc kubenswrapper[4749]: I0128 18:52:28.325873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:28 crc kubenswrapper[4749]: I0128 18:52:28.330190 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8szhl" event={"ID":"f3e87fb6-163a-4024-b3c6-f20d528fd58f","Type":"ContainerStarted","Data":"c6f372b4e7558b844fee3937df70848290d9aa15b1468a57dc2ef42cc6389b30"} Jan 28 18:52:28 crc kubenswrapper[4749]: I0128 18:52:28.330230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8szhl" event={"ID":"f3e87fb6-163a-4024-b3c6-f20d528fd58f","Type":"ContainerStarted","Data":"43a4b88be29e04829e769bda47d35892fea724971cd53b13c9a742ab67100618"} Jan 28 18:52:28 crc kubenswrapper[4749]: I0128 18:52:28.354008 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-wd4x6" podStartSLOduration=3.353993602 podStartE2EDuration="3.353993602s" podCreationTimestamp="2026-01-28 18:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:52:28.352408813 +0000 UTC m=+1016.363935598" watchObservedRunningTime="2026-01-28 18:52:28.353993602 +0000 UTC m=+1016.365520367" Jan 28 18:52:29 crc kubenswrapper[4749]: I0128 18:52:29.352643 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8szhl" event={"ID":"f3e87fb6-163a-4024-b3c6-f20d528fd58f","Type":"ContainerStarted","Data":"3d7455ad642b4c65240a1c5f0fb3063760ff2cb5a354ab6864615d2ecf6e3676"} Jan 28 18:52:29 crc kubenswrapper[4749]: I0128 18:52:29.352709 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8szhl" Jan 28 18:52:29 crc kubenswrapper[4749]: I0128 18:52:29.377384 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8szhl" podStartSLOduration=4.377365061 podStartE2EDuration="4.377365061s" podCreationTimestamp="2026-01-28 18:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:52:29.370067885 +0000 UTC m=+1017.381594660" watchObservedRunningTime="2026-01-28 18:52:29.377365061 +0000 UTC m=+1017.388891836" Jan 28 18:52:39 crc kubenswrapper[4749]: I0128 18:52:39.430611 4749 generic.go:334] "Generic (PLEG): container finished" podID="d597ae59-efbc-48b1-9f21-51ead30e9812" 
containerID="5f4ddf1005c3a9de2cbd98fb450f5b69f1eece677919c2d9749d930a344a23a2" exitCode=0 Jan 28 18:52:39 crc kubenswrapper[4749]: I0128 18:52:39.430688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerDied","Data":"5f4ddf1005c3a9de2cbd98fb450f5b69f1eece677919c2d9749d930a344a23a2"} Jan 28 18:52:39 crc kubenswrapper[4749]: I0128 18:52:39.432891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" event={"ID":"e8c5ee0f-766a-4f21-9243-1e926fd8ebb0","Type":"ContainerStarted","Data":"230eae3a3c89e2fce97d835924b601c0b339d880c10d62f24f5190404ddf6275"} Jan 28 18:52:39 crc kubenswrapper[4749]: I0128 18:52:39.433031 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:39 crc kubenswrapper[4749]: I0128 18:52:39.498223 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" podStartSLOduration=2.767748712 podStartE2EDuration="14.498202774s" podCreationTimestamp="2026-01-28 18:52:25 +0000 UTC" firstStartedPulling="2026-01-28 18:52:26.453012414 +0000 UTC m=+1014.464539189" lastFinishedPulling="2026-01-28 18:52:38.183466466 +0000 UTC m=+1026.194993251" observedRunningTime="2026-01-28 18:52:39.498152493 +0000 UTC m=+1027.509679268" watchObservedRunningTime="2026-01-28 18:52:39.498202774 +0000 UTC m=+1027.509729549" Jan 28 18:52:40 crc kubenswrapper[4749]: I0128 18:52:40.441737 4749 generic.go:334] "Generic (PLEG): container finished" podID="d597ae59-efbc-48b1-9f21-51ead30e9812" containerID="92e06c5280334d715a0b5743c5dadf91632caa58d746aecafc9d3573f3ed3bf7" exitCode=0 Jan 28 18:52:40 crc kubenswrapper[4749]: I0128 18:52:40.441867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerDied","Data":"92e06c5280334d715a0b5743c5dadf91632caa58d746aecafc9d3573f3ed3bf7"} Jan 28 18:52:41 crc kubenswrapper[4749]: I0128 18:52:41.450577 4749 generic.go:334] "Generic (PLEG): container finished" podID="d597ae59-efbc-48b1-9f21-51ead30e9812" containerID="2fb87305483cb4bcb432e9f28d1232b9c777b28b3614df983252949948572c3f" exitCode=0 Jan 28 18:52:41 crc kubenswrapper[4749]: I0128 18:52:41.450674 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerDied","Data":"2fb87305483cb4bcb432e9f28d1232b9c777b28b3614df983252949948572c3f"} Jan 28 18:52:42 crc kubenswrapper[4749]: I0128 18:52:42.461394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerStarted","Data":"d95e89bc1de1e49b6de8fd6f54f5ac812839ab74a3c5a4dec3d9fc56c4ce3ec9"} Jan 28 18:52:42 crc kubenswrapper[4749]: I0128 18:52:42.461636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerStarted","Data":"9b42db605fd08ad7c7fdfe41362277ce867d1f524e1f3da9143b76d14bc18294"} Jan 28 18:52:42 crc kubenswrapper[4749]: I0128 18:52:42.461648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" 
event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerStarted","Data":"8e5b71c5836e5a327f78113efdf39152ab42dbca4c32246fefc77c7b0c76d6bb"} Jan 28 18:52:46 crc kubenswrapper[4749]: I0128 18:52:46.547154 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-wd4x6" Jan 28 18:52:47 crc kubenswrapper[4749]: I0128 18:52:47.723872 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8szhl" Jan 28 18:52:48 crc kubenswrapper[4749]: I0128 18:52:48.508831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerStarted","Data":"3009cb09ce374c447f92ff85b2dbb297a27ccaea65fde7e1dcebd968a5cdb96e"} Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.498982 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-g8jzq"] Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.502215 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g8jzq" Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.506406 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cj5lg" Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.506619 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.508958 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.561302 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g8jzq"] Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.594386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerStarted","Data":"f1724cc072088f61b94f44f205ef410b59307427f8ae6d1692a15e432ba0760b"} Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.628911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8rts\" (UniqueName: \"kubernetes.io/projected/99969b89-3278-412e-a77d-4ea0bfdd34cc-kube-api-access-m8rts\") pod \"openstack-operator-index-g8jzq\" (UID: \"99969b89-3278-412e-a77d-4ea0bfdd34cc\") " pod="openstack-operators/openstack-operator-index-g8jzq" Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.731617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8rts\" (UniqueName: \"kubernetes.io/projected/99969b89-3278-412e-a77d-4ea0bfdd34cc-kube-api-access-m8rts\") pod \"openstack-operator-index-g8jzq\" (UID: \"99969b89-3278-412e-a77d-4ea0bfdd34cc\") " pod="openstack-operators/openstack-operator-index-g8jzq" Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.750450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8rts\" (UniqueName: \"kubernetes.io/projected/99969b89-3278-412e-a77d-4ea0bfdd34cc-kube-api-access-m8rts\") pod \"openstack-operator-index-g8jzq\" (UID: \"99969b89-3278-412e-a77d-4ea0bfdd34cc\") " pod="openstack-operators/openstack-operator-index-g8jzq" Jan 28 18:52:50 crc kubenswrapper[4749]: I0128 18:52:50.847502 4749 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g8jzq" Jan 28 18:52:51 crc kubenswrapper[4749]: I0128 18:52:51.250593 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g8jzq"] Jan 28 18:52:51 crc kubenswrapper[4749]: W0128 18:52:51.252995 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99969b89_3278_412e_a77d_4ea0bfdd34cc.slice/crio-258e5bb991d301b7b0a6ebc4b3fce500671a94f769987658df89ecb0a2e8603c WatchSource:0}: Error finding container 258e5bb991d301b7b0a6ebc4b3fce500671a94f769987658df89ecb0a2e8603c: Status 404 returned error can't find the container with id 258e5bb991d301b7b0a6ebc4b3fce500671a94f769987658df89ecb0a2e8603c Jan 28 18:52:51 crc kubenswrapper[4749]: I0128 18:52:51.607886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5xs5q" event={"ID":"d597ae59-efbc-48b1-9f21-51ead30e9812","Type":"ContainerStarted","Data":"9e589e755cf1b1d2073a7ff1aa1a262490ce474f5f6aeb69ed3fd2c753196b53"} Jan 28 18:52:51 crc kubenswrapper[4749]: I0128 18:52:51.608236 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:51 crc kubenswrapper[4749]: I0128 18:52:51.609592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g8jzq" event={"ID":"99969b89-3278-412e-a77d-4ea0bfdd34cc","Type":"ContainerStarted","Data":"258e5bb991d301b7b0a6ebc4b3fce500671a94f769987658df89ecb0a2e8603c"} Jan 28 18:52:51 crc kubenswrapper[4749]: I0128 18:52:51.610087 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:51 crc kubenswrapper[4749]: I0128 18:52:51.636529 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5xs5q" podStartSLOduration=15.261967789 podStartE2EDuration="26.63650843s" podCreationTimestamp="2026-01-28 18:52:25 +0000 UTC" firstStartedPulling="2026-01-28 18:52:26.826274345 +0000 UTC m=+1014.837801120" lastFinishedPulling="2026-01-28 18:52:38.200814986 +0000 UTC m=+1026.212341761" observedRunningTime="2026-01-28 18:52:51.631520477 +0000 UTC m=+1039.643047282" watchObservedRunningTime="2026-01-28 18:52:51.63650843 +0000 UTC m=+1039.648035225" Jan 28 18:52:53 crc kubenswrapper[4749]: I0128 18:52:53.764382 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-g8jzq"] Jan 28 18:52:54 crc kubenswrapper[4749]: I0128 18:52:54.275515 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tjghg"] Jan 28 18:52:54 crc kubenswrapper[4749]: I0128 18:52:54.276853 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:52:54 crc kubenswrapper[4749]: I0128 18:52:54.286252 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tjghg"] Jan 28 18:52:54 crc kubenswrapper[4749]: I0128 18:52:54.295909 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxs5z\" (UniqueName: \"kubernetes.io/projected/0c39e330-6c54-46c8-a4f7-529871b845db-kube-api-access-jxs5z\") pod \"openstack-operator-index-tjghg\" (UID: \"0c39e330-6c54-46c8-a4f7-529871b845db\") " pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:52:54 crc kubenswrapper[4749]: I0128 18:52:54.396685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxs5z\" (UniqueName: \"kubernetes.io/projected/0c39e330-6c54-46c8-a4f7-529871b845db-kube-api-access-jxs5z\") pod \"openstack-operator-index-tjghg\" (UID: \"0c39e330-6c54-46c8-a4f7-529871b845db\") " pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:52:54 crc kubenswrapper[4749]: I0128 18:52:54.418569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxs5z\" (UniqueName: \"kubernetes.io/projected/0c39e330-6c54-46c8-a4f7-529871b845db-kube-api-access-jxs5z\") pod \"openstack-operator-index-tjghg\" (UID: \"0c39e330-6c54-46c8-a4f7-529871b845db\") " pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:52:54 crc kubenswrapper[4749]: I0128 18:52:54.602913 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:52:55 crc kubenswrapper[4749]: I0128 18:52:55.621066 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tjghg"] Jan 28 18:52:55 crc kubenswrapper[4749]: I0128 18:52:55.635994 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tjghg" event={"ID":"0c39e330-6c54-46c8-a4f7-529871b845db","Type":"ContainerStarted","Data":"927e18ff8046bf2acc0f58ffa6f7afead05d12c431386ca9be8b799da2f6519c"} Jan 28 18:52:55 crc kubenswrapper[4749]: I0128 18:52:55.637305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g8jzq" event={"ID":"99969b89-3278-412e-a77d-4ea0bfdd34cc","Type":"ContainerStarted","Data":"dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20"} Jan 28 18:52:55 crc kubenswrapper[4749]: I0128 18:52:55.637459 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-g8jzq" podUID="99969b89-3278-412e-a77d-4ea0bfdd34cc" containerName="registry-server" containerID="cri-o://dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20" gracePeriod=2 Jan 28 18:52:55 crc kubenswrapper[4749]: I0128 18:52:55.668568 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-g8jzq" podStartSLOduration=1.659522255 podStartE2EDuration="5.668545558s" podCreationTimestamp="2026-01-28 18:52:50 +0000 UTC" firstStartedPulling="2026-01-28 18:52:51.255513524 +0000 UTC m=+1039.267040309" lastFinishedPulling="2026-01-28 18:52:55.264536837 +0000 UTC m=+1043.276063612" observedRunningTime="2026-01-28 18:52:55.65805309 +0000 UTC m=+1043.669579865" watchObservedRunningTime="2026-01-28 18:52:55.668545558 +0000 UTC 
m=+1043.680072353" Jan 28 18:52:55 crc kubenswrapper[4749]: I0128 18:52:55.973167 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-q6rrx" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.132123 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g8jzq" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.326584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8rts\" (UniqueName: \"kubernetes.io/projected/99969b89-3278-412e-a77d-4ea0bfdd34cc-kube-api-access-m8rts\") pod \"99969b89-3278-412e-a77d-4ea0bfdd34cc\" (UID: \"99969b89-3278-412e-a77d-4ea0bfdd34cc\") " Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.331860 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99969b89-3278-412e-a77d-4ea0bfdd34cc-kube-api-access-m8rts" (OuterVolumeSpecName: "kube-api-access-m8rts") pod "99969b89-3278-412e-a77d-4ea0bfdd34cc" (UID: "99969b89-3278-412e-a77d-4ea0bfdd34cc"). InnerVolumeSpecName "kube-api-access-m8rts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.428958 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8rts\" (UniqueName: \"kubernetes.io/projected/99969b89-3278-412e-a77d-4ea0bfdd34cc-kube-api-access-m8rts\") on node \"crc\" DevicePath \"\"" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.578391 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.615777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5xs5q" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.652215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tjghg" event={"ID":"0c39e330-6c54-46c8-a4f7-529871b845db","Type":"ContainerStarted","Data":"6a46cea0ba904e7f3358f4085bcb33e858ccad164b048a36ed02b46c997700dc"} Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.653497 4749 generic.go:334] "Generic (PLEG): container finished" podID="99969b89-3278-412e-a77d-4ea0bfdd34cc" containerID="dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20" exitCode=0 Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.654280 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-g8jzq" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.657784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g8jzq" event={"ID":"99969b89-3278-412e-a77d-4ea0bfdd34cc","Type":"ContainerDied","Data":"dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20"} Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.657858 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g8jzq" event={"ID":"99969b89-3278-412e-a77d-4ea0bfdd34cc","Type":"ContainerDied","Data":"258e5bb991d301b7b0a6ebc4b3fce500671a94f769987658df89ecb0a2e8603c"} Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.657879 4749 scope.go:117] "RemoveContainer" containerID="dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.672205 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tjghg" podStartSLOduration=2.609439347 podStartE2EDuration="2.672189747s" podCreationTimestamp="2026-01-28 18:52:54 +0000 UTC" firstStartedPulling="2026-01-28 18:52:55.624312762 +0000 UTC m=+1043.635839537" lastFinishedPulling="2026-01-28 18:52:55.687063162 +0000 UTC m=+1043.698589937" observedRunningTime="2026-01-28 18:52:56.6690676 +0000 UTC m=+1044.680594395" watchObservedRunningTime="2026-01-28 18:52:56.672189747 +0000 UTC m=+1044.683716522" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.690699 4749 scope.go:117] "RemoveContainer" containerID="dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20" Jan 28 18:52:56 crc kubenswrapper[4749]: E0128 18:52:56.691624 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20\": container with ID starting with dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20 not found: ID does not exist" containerID="dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.691698 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20"} err="failed to get container status \"dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20\": rpc error: code = NotFound desc = could not find container \"dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20\": container with ID starting with dcfc28c9f13b81fb081f77adfc069cc62748ebdf57fc5e35d7e246c322959c20 not found: ID does not exist" Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.722498 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-g8jzq"] Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.729656 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-g8jzq"] Jan 28 18:52:56 crc kubenswrapper[4749]: I0128 18:52:56.880441 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99969b89-3278-412e-a77d-4ea0bfdd34cc" path="/var/lib/kubelet/pods/99969b89-3278-412e-a77d-4ea0bfdd34cc/volumes" Jan 28 18:53:04 crc kubenswrapper[4749]: I0128 18:53:04.603798 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:53:04 crc kubenswrapper[4749]: I0128 18:53:04.604479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:53:04 crc kubenswrapper[4749]: I0128 18:53:04.630440 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:53:04 crc kubenswrapper[4749]: I0128 18:53:04.737540 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tjghg" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.111560 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr"] Jan 28 18:53:07 crc kubenswrapper[4749]: E0128 18:53:07.112381 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99969b89-3278-412e-a77d-4ea0bfdd34cc" containerName="registry-server" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.112399 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99969b89-3278-412e-a77d-4ea0bfdd34cc" containerName="registry-server" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.112575 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99969b89-3278-412e-a77d-4ea0bfdd34cc" containerName="registry-server" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.113760 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.115970 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-7ddzg" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.121043 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr"] Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.121456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88p8p\" (UniqueName: \"kubernetes.io/projected/608c8370-fa8f-40c3-8fe2-0ee96449e671-kube-api-access-88p8p\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.121506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-bundle\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.121844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-util\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 
18:53:07.224310 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88p8p\" (UniqueName: \"kubernetes.io/projected/608c8370-fa8f-40c3-8fe2-0ee96449e671-kube-api-access-88p8p\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.224394 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-bundle\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.224487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-util\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.225183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-bundle\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.225287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-util\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.244017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88p8p\" (UniqueName: \"kubernetes.io/projected/608c8370-fa8f-40c3-8fe2-0ee96449e671-kube-api-access-88p8p\") pod \"5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.433808 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:07 crc kubenswrapper[4749]: I0128 18:53:07.911169 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr"] Jan 28 18:53:08 crc kubenswrapper[4749]: I0128 18:53:08.741006 4749 generic.go:334] "Generic (PLEG): container finished" podID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerID="d1a7682fc752b1d307279626654f15b532d3161c4112c48889fbd6e6cb47bdb9" exitCode=0 Jan 28 18:53:08 crc kubenswrapper[4749]: I0128 18:53:08.742519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" event={"ID":"608c8370-fa8f-40c3-8fe2-0ee96449e671","Type":"ContainerDied","Data":"d1a7682fc752b1d307279626654f15b532d3161c4112c48889fbd6e6cb47bdb9"} Jan 28 18:53:08 crc kubenswrapper[4749]: I0128 18:53:08.742594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" event={"ID":"608c8370-fa8f-40c3-8fe2-0ee96449e671","Type":"ContainerStarted","Data":"6ff2117c597a6edd0a1c7b5ffef5e6cbc6394701393e43ff58da40cfcb74492f"} Jan 28 18:53:11 crc kubenswrapper[4749]: I0128 18:53:11.766260 4749 generic.go:334] "Generic (PLEG): container finished" podID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerID="9e742ea508896aa26abea6d173588c5b0d714531a7011c8632c1ff53fd999645" exitCode=0 Jan 28 18:53:11 crc kubenswrapper[4749]: I0128 18:53:11.766373 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" event={"ID":"608c8370-fa8f-40c3-8fe2-0ee96449e671","Type":"ContainerDied","Data":"9e742ea508896aa26abea6d173588c5b0d714531a7011c8632c1ff53fd999645"} Jan 28 18:53:12 crc kubenswrapper[4749]: W0128 18:53:12.131561 4749 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod608c8370_fa8f_40c3_8fe2_0ee96449e671.slice/crio-2e7ce5e9c81a6f354904de721eeaf1d4d0339cca3022c60a553fd5ae8abb4115.scope/cpu.max": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod608c8370_fa8f_40c3_8fe2_0ee96449e671.slice/crio-2e7ce5e9c81a6f354904de721eeaf1d4d0339cca3022c60a553fd5ae8abb4115.scope/cpu.max: no such device Jan 28 18:53:12 crc kubenswrapper[4749]: I0128 18:53:12.775954 4749 generic.go:334] "Generic (PLEG): container finished" podID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerID="2e7ce5e9c81a6f354904de721eeaf1d4d0339cca3022c60a553fd5ae8abb4115" exitCode=0 Jan 28 18:53:12 crc kubenswrapper[4749]: I0128 18:53:12.776000 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" event={"ID":"608c8370-fa8f-40c3-8fe2-0ee96449e671","Type":"ContainerDied","Data":"2e7ce5e9c81a6f354904de721eeaf1d4d0339cca3022c60a553fd5ae8abb4115"} Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.078605 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.244797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-bundle\") pod \"608c8370-fa8f-40c3-8fe2-0ee96449e671\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.245016 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88p8p\" (UniqueName: \"kubernetes.io/projected/608c8370-fa8f-40c3-8fe2-0ee96449e671-kube-api-access-88p8p\") pod \"608c8370-fa8f-40c3-8fe2-0ee96449e671\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.245091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-util\") pod \"608c8370-fa8f-40c3-8fe2-0ee96449e671\" (UID: \"608c8370-fa8f-40c3-8fe2-0ee96449e671\") " Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.245647 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-bundle" (OuterVolumeSpecName: "bundle") pod "608c8370-fa8f-40c3-8fe2-0ee96449e671" (UID: "608c8370-fa8f-40c3-8fe2-0ee96449e671"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.254722 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/608c8370-fa8f-40c3-8fe2-0ee96449e671-kube-api-access-88p8p" (OuterVolumeSpecName: "kube-api-access-88p8p") pod "608c8370-fa8f-40c3-8fe2-0ee96449e671" (UID: "608c8370-fa8f-40c3-8fe2-0ee96449e671"). InnerVolumeSpecName "kube-api-access-88p8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.254914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-util" (OuterVolumeSpecName: "util") pod "608c8370-fa8f-40c3-8fe2-0ee96449e671" (UID: "608c8370-fa8f-40c3-8fe2-0ee96449e671"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.347755 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88p8p\" (UniqueName: \"kubernetes.io/projected/608c8370-fa8f-40c3-8fe2-0ee96449e671-kube-api-access-88p8p\") on node \"crc\" DevicePath \"\"" Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.347791 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-util\") on node \"crc\" DevicePath \"\"" Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.347801 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c8370-fa8f-40c3-8fe2-0ee96449e671-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.797210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" event={"ID":"608c8370-fa8f-40c3-8fe2-0ee96449e671","Type":"ContainerDied","Data":"6ff2117c597a6edd0a1c7b5ffef5e6cbc6394701393e43ff58da40cfcb74492f"} Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.797609 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ff2117c597a6edd0a1c7b5ffef5e6cbc6394701393e43ff58da40cfcb74492f" Jan 28 18:53:14 crc kubenswrapper[4749]: I0128 18:53:14.797378 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.234306 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn"] Jan 28 18:53:19 crc kubenswrapper[4749]: E0128 18:53:19.235084 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerName="util" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.235101 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerName="util" Jan 28 18:53:19 crc kubenswrapper[4749]: E0128 18:53:19.235141 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerName="extract" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.235149 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerName="extract" Jan 28 18:53:19 crc kubenswrapper[4749]: E0128 18:53:19.235165 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerName="pull" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.235175 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerName="pull" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.235381 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="608c8370-fa8f-40c3-8fe2-0ee96449e671" containerName="extract" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.236046 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.245070 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-kn8xw" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.284810 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn"] Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.332394 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dns7\" (UniqueName: \"kubernetes.io/projected/147f1b91-3404-40b5-9a8b-7799ab71fadf-kube-api-access-7dns7\") pod \"openstack-operator-controller-init-858cbdb9cd-xhmkn\" (UID: \"147f1b91-3404-40b5-9a8b-7799ab71fadf\") " pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.433954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dns7\" (UniqueName: \"kubernetes.io/projected/147f1b91-3404-40b5-9a8b-7799ab71fadf-kube-api-access-7dns7\") pod \"openstack-operator-controller-init-858cbdb9cd-xhmkn\" (UID: \"147f1b91-3404-40b5-9a8b-7799ab71fadf\") " pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.467595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dns7\" (UniqueName: \"kubernetes.io/projected/147f1b91-3404-40b5-9a8b-7799ab71fadf-kube-api-access-7dns7\") pod \"openstack-operator-controller-init-858cbdb9cd-xhmkn\" (UID: \"147f1b91-3404-40b5-9a8b-7799ab71fadf\") " pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.553681 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" Jan 28 18:53:19 crc kubenswrapper[4749]: I0128 18:53:19.986980 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn"] Jan 28 18:53:20 crc kubenswrapper[4749]: I0128 18:53:20.843344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" event={"ID":"147f1b91-3404-40b5-9a8b-7799ab71fadf","Type":"ContainerStarted","Data":"d88aa5862b392779f17c45fc75782be1bb53fed40d3e869303a7d0f92a7ae037"} Jan 28 18:53:27 crc kubenswrapper[4749]: I0128 18:53:27.467488 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:53:27 crc kubenswrapper[4749]: I0128 18:53:27.467986 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:53:31 crc kubenswrapper[4749]: I0128 18:53:31.925581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" event={"ID":"147f1b91-3404-40b5-9a8b-7799ab71fadf","Type":"ContainerStarted","Data":"509e83a77cbba4f89c528c2ce8e1e55f383a4127c9336b884cd241b6695222fa"} Jan 28 18:53:31 crc kubenswrapper[4749]: I0128 18:53:31.926224 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" Jan 28 18:53:31 crc kubenswrapper[4749]: I0128 18:53:31.956150 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" podStartSLOduration=1.68262472 podStartE2EDuration="12.956129101s" podCreationTimestamp="2026-01-28 18:53:19 +0000 UTC" firstStartedPulling="2026-01-28 18:53:20.004069985 +0000 UTC m=+1068.015596760" lastFinishedPulling="2026-01-28 18:53:31.277574366 +0000 UTC m=+1079.289101141" observedRunningTime="2026-01-28 18:53:31.94875022 +0000 UTC m=+1079.960277005" watchObservedRunningTime="2026-01-28 18:53:31.956129101 +0000 UTC m=+1079.967655876" Jan 28 18:53:39 crc kubenswrapper[4749]: I0128 18:53:39.557997 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-858cbdb9cd-xhmkn" Jan 28 18:53:57 crc kubenswrapper[4749]: I0128 18:53:57.467550 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:53:57 crc kubenswrapper[4749]: I0128 18:53:57.468144 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.015966 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.017349 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.018897 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-nd96k" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.022781 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.030936 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.033276 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.047164 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.058871 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8hscx" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.059039 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.060447 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.066778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-75zfv" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.087666 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.088870 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.090931 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7kfsp" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.097489 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.128965 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.140161 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.141466 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.150013 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4pznm" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.170651 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.171965 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.180744 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-828dt" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.193254 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.211661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz8zh\" (UniqueName: \"kubernetes.io/projected/8e193a26-a0a4-48e1-a3bf-13b52f809e3e-kube-api-access-fz8zh\") pod \"glance-operator-controller-manager-6db5dbd896-q7r57\" (UID: \"8e193a26-a0a4-48e1-a3bf-13b52f809e3e\") " pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.211724 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25tkx\" (UniqueName: \"kubernetes.io/projected/b99425c5-bab7-48bb-a341-e1c160eac631-kube-api-access-25tkx\") pod \"designate-operator-controller-manager-66dfbd6f5d-n667g\" (UID: \"b99425c5-bab7-48bb-a341-e1c160eac631\") " pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.211769 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpf75\" (UniqueName: \"kubernetes.io/projected/b553f796-28d7-455e-92bb-05bbc30a9a27-kube-api-access-kpf75\") pod \"cinder-operator-controller-manager-f6487bd57-5ztsh\" (UID: \"b553f796-28d7-455e-92bb-05bbc30a9a27\") " 
pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.211800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhcvl\" (UniqueName: \"kubernetes.io/projected/b606ed73-e992-4755-ac52-4ace6b8b553c-kube-api-access-lhcvl\") pod \"barbican-operator-controller-manager-6bc7f4f4cf-7rcdb\" (UID: \"b606ed73-e992-4755-ac52-4ace6b8b553c\") " pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.218467 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.219552 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.223620 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zztsd" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.229425 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.238446 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-958664b5-2x98l"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.239847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.243739 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xtfhq" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.270770 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.279401 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.322166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpf75\" (UniqueName: \"kubernetes.io/projected/b553f796-28d7-455e-92bb-05bbc30a9a27-kube-api-access-kpf75\") pod \"cinder-operator-controller-manager-f6487bd57-5ztsh\" (UID: \"b553f796-28d7-455e-92bb-05bbc30a9a27\") " pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.322259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhcvl\" (UniqueName: \"kubernetes.io/projected/b606ed73-e992-4755-ac52-4ace6b8b553c-kube-api-access-lhcvl\") pod \"barbican-operator-controller-manager-6bc7f4f4cf-7rcdb\" (UID: \"b606ed73-e992-4755-ac52-4ace6b8b553c\") " pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.322355 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhz2m\" (UniqueName: 
\"kubernetes.io/projected/ce60ace7-ac44-45a5-9422-dade4b147417-kube-api-access-mhz2m\") pod \"heat-operator-controller-manager-587c6bfdcf-zbk86\" (UID: \"ce60ace7-ac44-45a5-9422-dade4b147417\") " pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.322496 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz8zh\" (UniqueName: \"kubernetes.io/projected/8e193a26-a0a4-48e1-a3bf-13b52f809e3e-kube-api-access-fz8zh\") pod \"glance-operator-controller-manager-6db5dbd896-q7r57\" (UID: \"8e193a26-a0a4-48e1-a3bf-13b52f809e3e\") " pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.322574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25tkx\" (UniqueName: \"kubernetes.io/projected/b99425c5-bab7-48bb-a341-e1c160eac631-kube-api-access-25tkx\") pod \"designate-operator-controller-manager-66dfbd6f5d-n667g\" (UID: \"b99425c5-bab7-48bb-a341-e1c160eac631\") " pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.322639 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8vxc\" (UniqueName: \"kubernetes.io/projected/aeea4bd2-a2a5-451b-9a40-02881d5901a0-kube-api-access-w8vxc\") pod \"horizon-operator-controller-manager-5fb775575f-g9cw9\" (UID: \"aeea4bd2-a2a5-451b-9a40-02881d5901a0\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.322835 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.328038 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.334516 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tw54q" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.361311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-958664b5-2x98l"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.385573 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpf75\" (UniqueName: \"kubernetes.io/projected/b553f796-28d7-455e-92bb-05bbc30a9a27-kube-api-access-kpf75\") pod \"cinder-operator-controller-manager-f6487bd57-5ztsh\" (UID: \"b553f796-28d7-455e-92bb-05bbc30a9a27\") " pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.385620 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz8zh\" (UniqueName: \"kubernetes.io/projected/8e193a26-a0a4-48e1-a3bf-13b52f809e3e-kube-api-access-fz8zh\") pod \"glance-operator-controller-manager-6db5dbd896-q7r57\" (UID: \"8e193a26-a0a4-48e1-a3bf-13b52f809e3e\") " pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.386245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhcvl\" (UniqueName: \"kubernetes.io/projected/b606ed73-e992-4755-ac52-4ace6b8b553c-kube-api-access-lhcvl\") pod \"barbican-operator-controller-manager-6bc7f4f4cf-7rcdb\" (UID: \"b606ed73-e992-4755-ac52-4ace6b8b553c\") " pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.386562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25tkx\" (UniqueName: \"kubernetes.io/projected/b99425c5-bab7-48bb-a341-e1c160eac631-kube-api-access-25tkx\") pod \"designate-operator-controller-manager-66dfbd6f5d-n667g\" (UID: \"b99425c5-bab7-48bb-a341-e1c160eac631\") " pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.406169 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.413260 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.415759 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-765668569f-p2dr2"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.418179 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.424366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8vxc\" (UniqueName: \"kubernetes.io/projected/aeea4bd2-a2a5-451b-9a40-02881d5901a0-kube-api-access-w8vxc\") pod \"horizon-operator-controller-manager-5fb775575f-g9cw9\" (UID: \"aeea4bd2-a2a5-451b-9a40-02881d5901a0\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.424458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhz2m\" (UniqueName: \"kubernetes.io/projected/ce60ace7-ac44-45a5-9422-dade4b147417-kube-api-access-mhz2m\") pod \"heat-operator-controller-manager-587c6bfdcf-zbk86\" (UID: \"ce60ace7-ac44-45a5-9422-dade4b147417\") " pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.424495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.424521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkggv\" (UniqueName: \"kubernetes.io/projected/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-kube-api-access-bkggv\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.424564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgsqx\" (UniqueName: \"kubernetes.io/projected/9749144e-8c20-470f-97f1-c450d9520c07-kube-api-access-dgsqx\") pod \"ironic-operator-controller-manager-958664b5-2x98l\" (UID: \"9749144e-8c20-470f-97f1-c450d9520c07\") " pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.430960 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-p5bqg" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.431128 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-765668569f-p2dr2"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.453828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.461779 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8vxc\" (UniqueName: \"kubernetes.io/projected/aeea4bd2-a2a5-451b-9a40-02881d5901a0-kube-api-access-w8vxc\") pod \"horizon-operator-controller-manager-5fb775575f-g9cw9\" (UID: \"aeea4bd2-a2a5-451b-9a40-02881d5901a0\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.463639 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhz2m\" (UniqueName: \"kubernetes.io/projected/ce60ace7-ac44-45a5-9422-dade4b147417-kube-api-access-mhz2m\") pod \"heat-operator-controller-manager-587c6bfdcf-zbk86\" (UID: \"ce60ace7-ac44-45a5-9422-dade4b147417\") " pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.474431 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.475607 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.478664 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zpkt7" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.489509 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.498897 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.500389 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.501807 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.502375 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-nvgvb" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.518187 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.526023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsqx\" (UniqueName: \"kubernetes.io/projected/9749144e-8c20-470f-97f1-c450d9520c07-kube-api-access-dgsqx\") pod \"ironic-operator-controller-manager-958664b5-2x98l\" (UID: \"9749144e-8c20-470f-97f1-c450d9520c07\") " pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.526114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bprg2\" (UniqueName: \"kubernetes.io/projected/d08f8b17-02c1-4000-a5a5-0e53b209472c-kube-api-access-bprg2\") pod \"keystone-operator-controller-manager-6978b79747-mgpfr\" (UID: \"d08f8b17-02c1-4000-a5a5-0e53b209472c\") " pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.526184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " 
pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.526211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvhxf\" (UniqueName: \"kubernetes.io/projected/b3ba0ea4-04ca-478e-ae2c-ebcbf3353362-kube-api-access-qvhxf\") pod \"manila-operator-controller-manager-765668569f-p2dr2\" (UID: \"b3ba0ea4-04ca-478e-ae2c-ebcbf3353362\") " pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.526240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkggv\" (UniqueName: \"kubernetes.io/projected/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-kube-api-access-bkggv\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:53:59 crc kubenswrapper[4749]: E0128 18:53:59.526552 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 18:53:59 crc kubenswrapper[4749]: E0128 18:53:59.526619 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert podName:76e8a2e5-b9cd-4114-af08-d767fb2ae45d nodeName:}" failed. No retries permitted until 2026-01-28 18:54:00.026601303 +0000 UTC m=+1108.038128078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert") pod "infra-operator-controller-manager-79955696d6-4t7fn" (UID: "76e8a2e5-b9cd-4114-af08-d767fb2ae45d") : secret "infra-operator-webhook-server-cert" not found Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.529823 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.541510 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.542730 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.545159 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-f764x" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.549130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgsqx\" (UniqueName: \"kubernetes.io/projected/9749144e-8c20-470f-97f1-c450d9520c07-kube-api-access-dgsqx\") pod \"ironic-operator-controller-manager-958664b5-2x98l\" (UID: \"9749144e-8c20-470f-97f1-c450d9520c07\") " pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.553432 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.553935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkggv\" (UniqueName: \"kubernetes.io/projected/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-kube-api-access-bkggv\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.557251 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.560006 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-svm5n" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.561524 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.563284 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.568007 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.604590 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.606058 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.616785 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.618018 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xl79p" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.620922 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.627997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xphvp\" (UniqueName: \"kubernetes.io/projected/ef2c3378-a1f3-4366-be22-66e12e71dcb2-kube-api-access-xphvp\") pod \"mariadb-operator-controller-manager-67bf948998-npbds\" (UID: \"ef2c3378-a1f3-4366-be22-66e12e71dcb2\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.628092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bprg2\" (UniqueName: \"kubernetes.io/projected/d08f8b17-02c1-4000-a5a5-0e53b209472c-kube-api-access-bprg2\") pod \"keystone-operator-controller-manager-6978b79747-mgpfr\" (UID: \"d08f8b17-02c1-4000-a5a5-0e53b209472c\") " pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.628221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvhxf\" (UniqueName: \"kubernetes.io/projected/b3ba0ea4-04ca-478e-ae2c-ebcbf3353362-kube-api-access-qvhxf\") pod \"manila-operator-controller-manager-765668569f-p2dr2\" (UID: \"b3ba0ea4-04ca-478e-ae2c-ebcbf3353362\") " pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.628314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjktv\" (UniqueName: \"kubernetes.io/projected/4c6ce987-6e7a-4c86-9269-115e16d51c4d-kube-api-access-xjktv\") pod \"neutron-operator-controller-manager-694c5bfc85-cl2lx\" (UID: \"4c6ce987-6e7a-4c86-9269-115e16d51c4d\") " pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.633200 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.634481 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.635630 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.638721 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-j8p5t" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.647010 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.648545 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.652145 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-b844d" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.654302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bprg2\" (UniqueName: \"kubernetes.io/projected/d08f8b17-02c1-4000-a5a5-0e53b209472c-kube-api-access-bprg2\") pod \"keystone-operator-controller-manager-6978b79747-mgpfr\" (UID: \"d08f8b17-02c1-4000-a5a5-0e53b209472c\") " pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.667983 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.668776 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.669240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvhxf\" (UniqueName: \"kubernetes.io/projected/b3ba0ea4-04ca-478e-ae2c-ebcbf3353362-kube-api-access-qvhxf\") pod \"manila-operator-controller-manager-765668569f-p2dr2\" (UID: \"b3ba0ea4-04ca-478e-ae2c-ebcbf3353362\") " pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.730387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xphvp\" (UniqueName: \"kubernetes.io/projected/ef2c3378-a1f3-4366-be22-66e12e71dcb2-kube-api-access-xphvp\") pod \"mariadb-operator-controller-manager-67bf948998-npbds\" (UID: \"ef2c3378-a1f3-4366-be22-66e12e71dcb2\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.730528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6ld5\" (UniqueName: \"kubernetes.io/projected/c5b52880-267a-4211-a94d-09d132976cf3-kube-api-access-r6ld5\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.730561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: 
\"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.730640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrl4f\" (UniqueName: \"kubernetes.io/projected/bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b-kube-api-access-wrl4f\") pod \"nova-operator-controller-manager-ddcbfd695-kq24f\" (UID: \"bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.730701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjktv\" (UniqueName: \"kubernetes.io/projected/4c6ce987-6e7a-4c86-9269-115e16d51c4d-kube-api-access-xjktv\") pod \"neutron-operator-controller-manager-694c5bfc85-cl2lx\" (UID: \"4c6ce987-6e7a-4c86-9269-115e16d51c4d\") " pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.730743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4v9\" (UniqueName: \"kubernetes.io/projected/b52dfbde-2120-4054-974f-5992f89c9811-kube-api-access-dg4v9\") pod \"octavia-operator-controller-manager-5c765b4558-hgs42\" (UID: \"b52dfbde-2120-4054-974f-5992f89c9811\") " pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.730772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7nt2\" (UniqueName: \"kubernetes.io/projected/c102c3d5-9654-48bf-be4f-6cb41b1f8d7a-kube-api-access-v7nt2\") pod \"ovn-operator-controller-manager-788c46999f-rbvsr\" (UID: \"c102c3d5-9654-48bf-be4f-6cb41b1f8d7a\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.731348 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.732847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.736107 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-28lff" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.754417 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.760087 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.776665 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.786068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjktv\" (UniqueName: \"kubernetes.io/projected/4c6ce987-6e7a-4c86-9269-115e16d51c4d-kube-api-access-xjktv\") pod \"neutron-operator-controller-manager-694c5bfc85-cl2lx\" (UID: \"4c6ce987-6e7a-4c86-9269-115e16d51c4d\") " pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.795102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xphvp\" (UniqueName: \"kubernetes.io/projected/ef2c3378-a1f3-4366-be22-66e12e71dcb2-kube-api-access-xphvp\") pod \"mariadb-operator-controller-manager-67bf948998-npbds\" (UID: \"ef2c3378-a1f3-4366-be22-66e12e71dcb2\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.808537 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.830568 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.830704 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.832694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwgc7\" (UniqueName: \"kubernetes.io/projected/a607ca6c-b867-42b6-b9ad-d5941671b685-kube-api-access-xwgc7\") pod \"placement-operator-controller-manager-5b964cf4cd-pqzkn\" (UID: \"a607ca6c-b867-42b6-b9ad-d5941671b685\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.832767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4v9\" (UniqueName: \"kubernetes.io/projected/b52dfbde-2120-4054-974f-5992f89c9811-kube-api-access-dg4v9\") pod \"octavia-operator-controller-manager-5c765b4558-hgs42\" (UID: \"b52dfbde-2120-4054-974f-5992f89c9811\") " pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.832794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7nt2\" (UniqueName: \"kubernetes.io/projected/c102c3d5-9654-48bf-be4f-6cb41b1f8d7a-kube-api-access-v7nt2\") pod \"ovn-operator-controller-manager-788c46999f-rbvsr\" (UID: \"c102c3d5-9654-48bf-be4f-6cb41b1f8d7a\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.832835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5qcz\" (UniqueName: \"kubernetes.io/projected/66c3a9b0-5edc-468c-bf0e-309cac8be928-kube-api-access-h5qcz\") pod 
\"swift-operator-controller-manager-68fc8c869-5szzt\" (UID: \"66c3a9b0-5edc-468c-bf0e-309cac8be928\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.832915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6ld5\" (UniqueName: \"kubernetes.io/projected/c5b52880-267a-4211-a94d-09d132976cf3-kube-api-access-r6ld5\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.832962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.833003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrl4f\" (UniqueName: \"kubernetes.io/projected/bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b-kube-api-access-wrl4f\") pod \"nova-operator-controller-manager-ddcbfd695-kq24f\" (UID: \"bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" Jan 28 18:53:59 crc kubenswrapper[4749]: E0128 18:53:59.833665 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:53:59 crc kubenswrapper[4749]: E0128 18:53:59.833710 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert podName:c5b52880-267a-4211-a94d-09d132976cf3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:00.333696436 +0000 UTC m=+1108.345223211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" (UID: "c5b52880-267a-4211-a94d-09d132976cf3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.841378 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.842535 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.851667 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-52cb6" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.858906 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.862412 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-68djt" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.872032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7nt2\" (UniqueName: \"kubernetes.io/projected/c102c3d5-9654-48bf-be4f-6cb41b1f8d7a-kube-api-access-v7nt2\") pod \"ovn-operator-controller-manager-788c46999f-rbvsr\" (UID: \"c102c3d5-9654-48bf-be4f-6cb41b1f8d7a\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.889769 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.898265 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.914079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrl4f\" (UniqueName: \"kubernetes.io/projected/bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b-kube-api-access-wrl4f\") pod \"nova-operator-controller-manager-ddcbfd695-kq24f\" (UID: \"bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.914863 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6ld5\" (UniqueName: \"kubernetes.io/projected/c5b52880-267a-4211-a94d-09d132976cf3-kube-api-access-r6ld5\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.919394 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.921544 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.923227 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp"] Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.924432 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-k2qhs" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.935212 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwgc7\" (UniqueName: \"kubernetes.io/projected/a607ca6c-b867-42b6-b9ad-d5941671b685-kube-api-access-xwgc7\") pod \"placement-operator-controller-manager-5b964cf4cd-pqzkn\" (UID: \"a607ca6c-b867-42b6-b9ad-d5941671b685\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.935291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qcz\" (UniqueName: \"kubernetes.io/projected/66c3a9b0-5edc-468c-bf0e-309cac8be928-kube-api-access-h5qcz\") pod \"swift-operator-controller-manager-68fc8c869-5szzt\" (UID: \"66c3a9b0-5edc-468c-bf0e-309cac8be928\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.935366 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghnm\" (UniqueName: \"kubernetes.io/projected/87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c-kube-api-access-jghnm\") pod \"telemetry-operator-controller-manager-877d65859-6skzm\" (UID: \"87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c\") " pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.935403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdvvj\" (UniqueName: \"kubernetes.io/projected/8780976f-c471-4ba4-956c-33ea9089437f-kube-api-access-wdvvj\") pod \"test-operator-controller-manager-56f8bfcd9f-x28dk\" (UID: \"8780976f-c471-4ba4-956c-33ea9089437f\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.936119 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.942941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4v9\" (UniqueName: \"kubernetes.io/projected/b52dfbde-2120-4054-974f-5992f89c9811-kube-api-access-dg4v9\") pod \"octavia-operator-controller-manager-5c765b4558-hgs42\" (UID: \"b52dfbde-2120-4054-974f-5992f89c9811\") " pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.969509 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" Jan 28 18:53:59 crc kubenswrapper[4749]: I0128 18:53:59.988854 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.015111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwgc7\" (UniqueName: \"kubernetes.io/projected/a607ca6c-b867-42b6-b9ad-d5941671b685-kube-api-access-xwgc7\") pod \"placement-operator-controller-manager-5b964cf4cd-pqzkn\" (UID: \"a607ca6c-b867-42b6-b9ad-d5941671b685\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.025492 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.040461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jghnm\" (UniqueName: \"kubernetes.io/projected/87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c-kube-api-access-jghnm\") pod \"telemetry-operator-controller-manager-877d65859-6skzm\" (UID: \"87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c\") " pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.040584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdvvj\" (UniqueName: \"kubernetes.io/projected/8780976f-c471-4ba4-956c-33ea9089437f-kube-api-access-wdvvj\") pod \"test-operator-controller-manager-56f8bfcd9f-x28dk\" (UID: \"8780976f-c471-4ba4-956c-33ea9089437f\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.040682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.040865 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbrq\" (UniqueName: \"kubernetes.io/projected/cd842919-01b8-4886-914e-2eb9e731f65f-kube-api-access-4wbrq\") pod \"watcher-operator-controller-manager-767b8bc766-8kcjp\" (UID: \"cd842919-01b8-4886-914e-2eb9e731f65f\") " pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.041568 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.041637 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert podName:76e8a2e5-b9cd-4114-af08-d767fb2ae45d nodeName:}" failed. No retries permitted until 2026-01-28 18:54:01.041617047 +0000 UTC m=+1109.053143822 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert") pod "infra-operator-controller-manager-79955696d6-4t7fn" (UID: "76e8a2e5-b9cd-4114-af08-d767fb2ae45d") : secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.065571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5qcz\" (UniqueName: \"kubernetes.io/projected/66c3a9b0-5edc-468c-bf0e-309cac8be928-kube-api-access-h5qcz\") pod \"swift-operator-controller-manager-68fc8c869-5szzt\" (UID: \"66c3a9b0-5edc-468c-bf0e-309cac8be928\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.066426 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.076761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jghnm\" (UniqueName: \"kubernetes.io/projected/87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c-kube-api-access-jghnm\") pod \"telemetry-operator-controller-manager-877d65859-6skzm\" (UID: \"87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c\") " pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.086423 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdvvj\" (UniqueName: \"kubernetes.io/projected/8780976f-c471-4ba4-956c-33ea9089437f-kube-api-access-wdvvj\") pod \"test-operator-controller-manager-56f8bfcd9f-x28dk\" (UID: \"8780976f-c471-4ba4-956c-33ea9089437f\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.094749 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.099298 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p"] Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.100315 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.104581 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.104738 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.105009 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kbpz9" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.111545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p"] Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.136081 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.139627 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.141990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbrq\" (UniqueName: \"kubernetes.io/projected/cd842919-01b8-4886-914e-2eb9e731f65f-kube-api-access-4wbrq\") pod \"watcher-operator-controller-manager-767b8bc766-8kcjp\" (UID: \"cd842919-01b8-4886-914e-2eb9e731f65f\") " pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.143397 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs"] Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.145392 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.152156 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs"] Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.155488 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-l9kss" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.161903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbrq\" (UniqueName: \"kubernetes.io/projected/cd842919-01b8-4886-914e-2eb9e731f65f-kube-api-access-4wbrq\") pod \"watcher-operator-controller-manager-767b8bc766-8kcjp\" (UID: \"cd842919-01b8-4886-914e-2eb9e731f65f\") " pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.243494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.243582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzznp\" (UniqueName: \"kubernetes.io/projected/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-kube-api-access-hzznp\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.243690 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlllt\" (UniqueName: \"kubernetes.io/projected/0f56d027-e393-4583-ab10-8dc8c1046027-kube-api-access-hlllt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ssdhs\" (UID: \"0f56d027-e393-4583-ab10-8dc8c1046027\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.243744 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.345726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.345816 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlllt\" (UniqueName: \"kubernetes.io/projected/0f56d027-e393-4583-ab10-8dc8c1046027-kube-api-access-hlllt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ssdhs\" (UID: \"0f56d027-e393-4583-ab10-8dc8c1046027\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.345880 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.345936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.345950 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.345987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzznp\" (UniqueName: \"kubernetes.io/projected/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-kube-api-access-hzznp\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.346032 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert podName:c5b52880-267a-4211-a94d-09d132976cf3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:01.346009963 +0000 UTC m=+1109.357536738 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" (UID: "c5b52880-267a-4211-a94d-09d132976cf3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.346467 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.346496 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.346533 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:00.846514436 +0000 UTC m=+1108.858041221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "metrics-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.346557 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:00.846546757 +0000 UTC m=+1108.858073662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "webhook-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.368235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlllt\" (UniqueName: \"kubernetes.io/projected/0f56d027-e393-4583-ab10-8dc8c1046027-kube-api-access-hlllt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ssdhs\" (UID: \"0f56d027-e393-4583-ab10-8dc8c1046027\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.369962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzznp\" (UniqueName: \"kubernetes.io/projected/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-kube-api-access-hzznp\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.388730 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57"] Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.451912 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.480382 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.550478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g"] Jan 28 18:54:00 crc kubenswrapper[4749]: W0128 18:54:00.579461 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb99425c5_bab7_48bb_a341_e1c160eac631.slice/crio-cb2c40fa8de70993f0d12a9033bc5bef654335d3706f073e6735cf23a0f9e777 WatchSource:0}: Error finding container cb2c40fa8de70993f0d12a9033bc5bef654335d3706f073e6735cf23a0f9e777: Status 404 returned error can't find the container with id cb2c40fa8de70993f0d12a9033bc5bef654335d3706f073e6735cf23a0f9e777 Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.858062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: I0128 18:54:00.859471 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.858286 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.859733 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:01.859717664 +0000 UTC m=+1109.871244439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "metrics-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.859675 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 18:54:00 crc kubenswrapper[4749]: E0128 18:54:00.859764 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:01.859757715 +0000 UTC m=+1109.871284490 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "webhook-server-cert" not found Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.027832 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9"] Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.062257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:01 crc kubenswrapper[4749]: E0128 18:54:01.062441 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:01 crc kubenswrapper[4749]: E0128 18:54:01.062492 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert podName:76e8a2e5-b9cd-4114-af08-d767fb2ae45d nodeName:}" failed. No retries permitted until 2026-01-28 18:54:03.062477058 +0000 UTC m=+1111.074003823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert") pod "infra-operator-controller-manager-79955696d6-4t7fn" (UID: "76e8a2e5-b9cd-4114-af08-d767fb2ae45d") : secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.064140 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-958664b5-2x98l"] Jan 28 18:54:01 crc kubenswrapper[4749]: W0128 18:54:01.128018 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9749144e_8c20_470f_97f1_c450d9520c07.slice/crio-91156b069dd1fd480818d998794038725c1fceb21d5cc565a4156b60dfa7418f WatchSource:0}: Error finding container 91156b069dd1fd480818d998794038725c1fceb21d5cc565a4156b60dfa7418f: Status 404 returned error can't find the container with id 91156b069dd1fd480818d998794038725c1fceb21d5cc565a4156b60dfa7418f Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.216828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86"] Jan 28 18:54:01 crc kubenswrapper[4749]: W0128 18:54:01.285491 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce60ace7_ac44_45a5_9422_dade4b147417.slice/crio-51d08f0aa19f745ad63e54e3d4a6894961060266dc8c44c83ab7d9c89294d645 WatchSource:0}: Error finding container 51d08f0aa19f745ad63e54e3d4a6894961060266dc8c44c83ab7d9c89294d645: Status 404 returned error can't find the container with id 51d08f0aa19f745ad63e54e3d4a6894961060266dc8c44c83ab7d9c89294d645 Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.374165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" 
(UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:01 crc kubenswrapper[4749]: E0128 18:54:01.374382 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:01 crc kubenswrapper[4749]: E0128 18:54:01.374448 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert podName:c5b52880-267a-4211-a94d-09d132976cf3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:03.374428871 +0000 UTC m=+1111.385955646 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" (UID: "c5b52880-267a-4211-a94d-09d132976cf3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.444565 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb"] Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.507061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" event={"ID":"aeea4bd2-a2a5-451b-9a40-02881d5901a0","Type":"ContainerStarted","Data":"09f0099306d3cc323a33306251f2e4094d99b903868a47caacfb33d2aa5315fe"} Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.508197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" event={"ID":"9749144e-8c20-470f-97f1-c450d9520c07","Type":"ContainerStarted","Data":"91156b069dd1fd480818d998794038725c1fceb21d5cc565a4156b60dfa7418f"} Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.509215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" event={"ID":"b99425c5-bab7-48bb-a341-e1c160eac631","Type":"ContainerStarted","Data":"cb2c40fa8de70993f0d12a9033bc5bef654335d3706f073e6735cf23a0f9e777"} Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.510138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" event={"ID":"8e193a26-a0a4-48e1-a3bf-13b52f809e3e","Type":"ContainerStarted","Data":"b8f49405f2b24751884f2a84836b7dbfb86ffd4ee57ed234b90bba0f15b8536c"} Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.510926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" event={"ID":"ce60ace7-ac44-45a5-9422-dade4b147417","Type":"ContainerStarted","Data":"51d08f0aa19f745ad63e54e3d4a6894961060266dc8c44c83ab7d9c89294d645"} Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.513262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" event={"ID":"b606ed73-e992-4755-ac52-4ace6b8b553c","Type":"ContainerStarted","Data":"2a934d6b687255400a79438943ccc0bb110dd7c9fcc44d2b19b3b419b264d88f"} Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.876992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk"] Jan 28 
18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.888164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:01 crc kubenswrapper[4749]: I0128 18:54:01.888227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:01 crc kubenswrapper[4749]: E0128 18:54:01.888347 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 18:54:01 crc kubenswrapper[4749]: E0128 18:54:01.888399 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:03.888382658 +0000 UTC m=+1111.899909433 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "metrics-server-cert" not found Jan 28 18:54:01 crc kubenswrapper[4749]: E0128 18:54:01.888469 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 18:54:01 crc kubenswrapper[4749]: E0128 18:54:01.888536 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:03.888516171 +0000 UTC m=+1111.900043026 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "webhook-server-cert" not found Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.052155 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.102865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.109056 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.151975 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.168196 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr"] Jan 28 18:54:02 crc kubenswrapper[4749]: W0128 18:54:02.183341 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc102c3d5_9654_48bf_be4f_6cb41b1f8d7a.slice/crio-bc8f77f10b8ae6a9c8ee33ee34e5622281ea37abf9b8b57aba20b90872d7e55f WatchSource:0}: Error finding container bc8f77f10b8ae6a9c8ee33ee34e5622281ea37abf9b8b57aba20b90872d7e55f: Status 404 returned error can't find the container with id bc8f77f10b8ae6a9c8ee33ee34e5622281ea37abf9b8b57aba20b90872d7e55f Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.191493 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.308474 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn"] Jan 28 18:54:02 crc kubenswrapper[4749]: W0128 18:54:02.325295 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda607ca6c_b867_42b6_b9ad_d5941671b685.slice/crio-b674c930fefd81ad30695aa1ccc5b681003e21807da86fd35cadc20448b33fba WatchSource:0}: Error finding container b674c930fefd81ad30695aa1ccc5b681003e21807da86fd35cadc20448b33fba: Status 404 returned error can't find the container with id b674c930fefd81ad30695aa1ccc5b681003e21807da86fd35cadc20448b33fba Jan 28 18:54:02 crc kubenswrapper[4749]: W0128 18:54:02.335678 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbd77250_3bcc_4f6f_acc3_4e05f1b01c7b.slice/crio-79f4be7fb6acaf9193017b7d133394c254631cf75c2e1a77765bd99ede771325 WatchSource:0}: Error finding container 79f4be7fb6acaf9193017b7d133394c254631cf75c2e1a77765bd99ede771325: Status 404 returned error can't find the container with id 79f4be7fb6acaf9193017b7d133394c254631cf75c2e1a77765bd99ede771325 Jan 28 18:54:02 crc kubenswrapper[4749]: W0128 18:54:02.337635 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f56d027_e393_4583_ab10_8dc8c1046027.slice/crio-9ec7dca4b05fe0a94fdd627177590337b680b3dcf48d603c76e127113ff72623 WatchSource:0}: Error finding container 9ec7dca4b05fe0a94fdd627177590337b680b3dcf48d603c76e127113ff72623: Status 404 returned error can't find the container with id 9ec7dca4b05fe0a94fdd627177590337b680b3dcf48d603c76e127113ff72623 Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.338662 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.353192 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.359311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.542166 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.551780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" event={"ID":"c102c3d5-9654-48bf-be4f-6cb41b1f8d7a","Type":"ContainerStarted","Data":"bc8f77f10b8ae6a9c8ee33ee34e5622281ea37abf9b8b57aba20b90872d7e55f"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.554739 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-765668569f-p2dr2"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.559668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" event={"ID":"0f56d027-e393-4583-ab10-8dc8c1046027","Type":"ContainerStarted","Data":"9ec7dca4b05fe0a94fdd627177590337b680b3dcf48d603c76e127113ff72623"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.560864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" event={"ID":"cd842919-01b8-4886-914e-2eb9e731f65f","Type":"ContainerStarted","Data":"aca43e049ee77c516de180927791b06c292ea752a53dfc94dfd4b0ee8f97c2da"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.561628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" event={"ID":"66c3a9b0-5edc-468c-bf0e-309cac8be928","Type":"ContainerStarted","Data":"6984d2ade5523d4ceaed82ff5c5da6dc4c586e7bca9f9a9c0a475b23efaba597"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.562313 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" event={"ID":"8780976f-c471-4ba4-956c-33ea9089437f","Type":"ContainerStarted","Data":"012492c5cebfde73a8fd81c3e4f431366dda92df69ad7f64a7b5c0c35b5c59cd"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.564670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" event={"ID":"ef2c3378-a1f3-4366-be22-66e12e71dcb2","Type":"ContainerStarted","Data":"cabf3d3e9cfec4751619b343142e1e85a3637d6e8dd2d2bea8281cdc2d6248a3"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.565555 4749 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm"] Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.567941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" event={"ID":"a607ca6c-b867-42b6-b9ad-d5941671b685","Type":"ContainerStarted","Data":"b674c930fefd81ad30695aa1ccc5b681003e21807da86fd35cadc20448b33fba"} Jan 28 18:54:02 crc kubenswrapper[4749]: E0128 18:54:02.570334 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:774b657c4a2d169eb939c51d71a146bf4a44e93b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jghnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-877d65859-6skzm_openstack-operators(87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 18:54:02 crc kubenswrapper[4749]: E0128 18:54:02.571788 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" podUID="87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c" Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.571921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" 
event={"ID":"b553f796-28d7-455e-92bb-05bbc30a9a27","Type":"ContainerStarted","Data":"03836369bc282428b58a00271608ebd1252bec6e1e0a0717390fcf252d02b6cc"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.572883 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" event={"ID":"b52dfbde-2120-4054-974f-5992f89c9811","Type":"ContainerStarted","Data":"269cf9354e82d6dfc21f94e2b4fef4e764fb0d0d88540de5ba3c10a56b5478f6"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.576035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" event={"ID":"d08f8b17-02c1-4000-a5a5-0e53b209472c","Type":"ContainerStarted","Data":"bed1630eadbd9309894b1e7e4a718c8fba494497321165c2f251bf36cf71104d"} Jan 28 18:54:02 crc kubenswrapper[4749]: I0128 18:54:02.580179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" event={"ID":"bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b","Type":"ContainerStarted","Data":"79f4be7fb6acaf9193017b7d133394c254631cf75c2e1a77765bd99ede771325"} Jan 28 18:54:03 crc kubenswrapper[4749]: I0128 18:54:03.120876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.121166 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.121222 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert podName:76e8a2e5-b9cd-4114-af08-d767fb2ae45d nodeName:}" failed. No retries permitted until 2026-01-28 18:54:07.121204809 +0000 UTC m=+1115.132731584 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert") pod "infra-operator-controller-manager-79955696d6-4t7fn" (UID: "76e8a2e5-b9cd-4114-af08-d767fb2ae45d") : secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:03 crc kubenswrapper[4749]: I0128 18:54:03.429384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.430182 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.430265 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert podName:c5b52880-267a-4211-a94d-09d132976cf3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:07.4302326 +0000 UTC m=+1115.441759375 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" (UID: "c5b52880-267a-4211-a94d-09d132976cf3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:03 crc kubenswrapper[4749]: I0128 18:54:03.592692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" event={"ID":"4c6ce987-6e7a-4c86-9269-115e16d51c4d","Type":"ContainerStarted","Data":"db1ede0733ed01f4efa482495695b4456a28b3c6dd32b41613fead2ac1ed3330"} Jan 28 18:54:03 crc kubenswrapper[4749]: I0128 18:54:03.594309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" event={"ID":"b3ba0ea4-04ca-478e-ae2c-ebcbf3353362","Type":"ContainerStarted","Data":"aeb5d0c716bb2a54b73f38a1b707a40dbd9e846c86e8f44735d22adcbec4b662"} Jan 28 18:54:03 crc kubenswrapper[4749]: I0128 18:54:03.595230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" event={"ID":"87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c","Type":"ContainerStarted","Data":"e8cd8d5439fd433fa0bcbbc759bc09441a7d9a041b70e2d73a059044c5c4fbc9"} Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.598756 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:774b657c4a2d169eb939c51d71a146bf4a44e93b\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" podUID="87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c" Jan 28 18:54:03 crc kubenswrapper[4749]: I0128 18:54:03.937861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.937986 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.938048 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:07.938033447 +0000 UTC m=+1115.949560222 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "webhook-server-cert" not found Jan 28 18:54:03 crc kubenswrapper[4749]: I0128 18:54:03.938590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.938675 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 18:54:03 crc kubenswrapper[4749]: E0128 18:54:03.938802 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:07.938790505 +0000 UTC m=+1115.950317280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "metrics-server-cert" not found Jan 28 18:54:04 crc kubenswrapper[4749]: E0128 18:54:04.604473 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:774b657c4a2d169eb939c51d71a146bf4a44e93b\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" podUID="87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c" Jan 28 18:54:07 crc kubenswrapper[4749]: I0128 18:54:07.198437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:07 crc kubenswrapper[4749]: E0128 18:54:07.198765 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:07 crc kubenswrapper[4749]: E0128 18:54:07.199012 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert podName:76e8a2e5-b9cd-4114-af08-d767fb2ae45d nodeName:}" failed. No retries permitted until 2026-01-28 18:54:15.198979849 +0000 UTC m=+1123.210506624 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert") pod "infra-operator-controller-manager-79955696d6-4t7fn" (UID: "76e8a2e5-b9cd-4114-af08-d767fb2ae45d") : secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:07 crc kubenswrapper[4749]: I0128 18:54:07.505938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:07 crc kubenswrapper[4749]: E0128 18:54:07.506195 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:07 crc kubenswrapper[4749]: E0128 18:54:07.506284 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert podName:c5b52880-267a-4211-a94d-09d132976cf3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:15.506265346 +0000 UTC m=+1123.517792121 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" (UID: "c5b52880-267a-4211-a94d-09d132976cf3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:08 crc kubenswrapper[4749]: I0128 18:54:08.015588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:08 crc kubenswrapper[4749]: I0128 18:54:08.015944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:08 crc kubenswrapper[4749]: E0128 18:54:08.015778 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 18:54:08 crc kubenswrapper[4749]: E0128 18:54:08.016273 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:16.016251346 +0000 UTC m=+1124.027778121 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "metrics-server-cert" not found Jan 28 18:54:08 crc kubenswrapper[4749]: E0128 18:54:08.016199 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 18:54:08 crc kubenswrapper[4749]: E0128 18:54:08.017313 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:16.017297022 +0000 UTC m=+1124.028823797 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "webhook-server-cert" not found Jan 28 18:54:13 crc kubenswrapper[4749]: E0128 18:54:13.922632 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3899449467/1\": happened during read: context canceled" image="quay.io/lmiccini/glance-operator@sha256:8a7e2637765333c555b0b932c2bfc789235aea2c7276961657a03ef1352a7264" Jan 28 18:54:13 crc kubenswrapper[4749]: E0128 18:54:13.923547 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/glance-operator@sha256:8a7e2637765333c555b0b932c2bfc789235aea2c7276961657a03ef1352a7264,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fz8zh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-6db5dbd896-q7r57_openstack-operators(8e193a26-a0a4-48e1-a3bf-13b52f809e3e): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3899449467/1\": happened during read: context canceled" logger="UnhandledError" Jan 28 18:54:13 crc kubenswrapper[4749]: E0128 18:54:13.924743 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage3899449467/1\\\": happened during read: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" podUID="8e193a26-a0a4-48e1-a3bf-13b52f809e3e" Jan 28 18:54:14 crc kubenswrapper[4749]: E0128 18:54:14.685612 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/glance-operator@sha256:8a7e2637765333c555b0b932c2bfc789235aea2c7276961657a03ef1352a7264\\\"\"" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" podUID="8e193a26-a0a4-48e1-a3bf-13b52f809e3e" Jan 28 18:54:15 crc kubenswrapper[4749]: I0128 18:54:15.268224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:15 crc kubenswrapper[4749]: E0128 18:54:15.268429 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:15 crc kubenswrapper[4749]: E0128 18:54:15.268500 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert podName:76e8a2e5-b9cd-4114-af08-d767fb2ae45d nodeName:}" failed. No retries permitted until 2026-01-28 18:54:31.268481456 +0000 UTC m=+1139.280008231 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert") pod "infra-operator-controller-manager-79955696d6-4t7fn" (UID: "76e8a2e5-b9cd-4114-af08-d767fb2ae45d") : secret "infra-operator-webhook-server-cert" not found Jan 28 18:54:15 crc kubenswrapper[4749]: I0128 18:54:15.573213 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:15 crc kubenswrapper[4749]: E0128 18:54:15.573418 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:15 crc kubenswrapper[4749]: E0128 18:54:15.573478 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert podName:c5b52880-267a-4211-a94d-09d132976cf3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:31.573461018 +0000 UTC m=+1139.584987793 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" (UID: "c5b52880-267a-4211-a94d-09d132976cf3") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 18:54:16 crc kubenswrapper[4749]: I0128 18:54:16.084556 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:16 crc kubenswrapper[4749]: I0128 18:54:16.084654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:16 crc kubenswrapper[4749]: E0128 18:54:16.085152 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 18:54:16 crc kubenswrapper[4749]: E0128 18:54:16.085882 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs podName:5ec9e008-61f0-4635-8e4c-2ae6fe098fc3 nodeName:}" failed. No retries permitted until 2026-01-28 18:54:32.085311264 +0000 UTC m=+1140.096838099 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs") pod "openstack-operator-controller-manager-798d8549d8-9kb8p" (UID: "5ec9e008-61f0-4635-8e4c-2ae6fe098fc3") : secret "metrics-server-cert" not found Jan 28 18:54:16 crc kubenswrapper[4749]: I0128 18:54:16.091408 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-webhook-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:19 crc kubenswrapper[4749]: E0128 18:54:19.685385 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/barbican-operator@sha256:eae1fc0ecdfc4f0bef5a980affa60155a5baacf1bdaaeeb18d9c2680f762bc9d" Jan 28 18:54:19 crc kubenswrapper[4749]: E0128 18:54:19.686177 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/barbican-operator@sha256:eae1fc0ecdfc4f0bef5a980affa60155a5baacf1bdaaeeb18d9c2680f762bc9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lhcvl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-6bc7f4f4cf-7rcdb_openstack-operators(b606ed73-e992-4755-ac52-4ace6b8b553c): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:19 crc kubenswrapper[4749]: E0128 18:54:19.687400 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" podUID="b606ed73-e992-4755-ac52-4ace6b8b553c" Jan 28 18:54:19 crc kubenswrapper[4749]: E0128 18:54:19.722864 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/barbican-operator@sha256:eae1fc0ecdfc4f0bef5a980affa60155a5baacf1bdaaeeb18d9c2680f762bc9d\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" podUID="b606ed73-e992-4755-ac52-4ace6b8b553c" Jan 28 18:54:25 crc kubenswrapper[4749]: E0128 18:54:25.005888 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/ironic-operator@sha256:5f48b6af05a584d3da5c973f83195d999cc151aa0f187cabc8002cb46d60afe5" Jan 28 18:54:25 crc kubenswrapper[4749]: E0128 18:54:25.006562 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/ironic-operator@sha256:5f48b6af05a584d3da5c973f83195d999cc151aa0f187cabc8002cb46d60afe5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dgsqx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-958664b5-2x98l_openstack-operators(9749144e-8c20-470f-97f1-c450d9520c07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:25 crc kubenswrapper[4749]: E0128 18:54:25.008000 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" podUID="9749144e-8c20-470f-97f1-c450d9520c07" Jan 28 18:54:25 crc kubenswrapper[4749]: E0128 18:54:25.952895 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/ironic-operator@sha256:5f48b6af05a584d3da5c973f83195d999cc151aa0f187cabc8002cb46d60afe5\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" podUID="9749144e-8c20-470f-97f1-c450d9520c07" Jan 28 18:54:26 crc kubenswrapper[4749]: E0128 18:54:26.492167 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf" Jan 28 18:54:26 crc kubenswrapper[4749]: E0128 18:54:26.492613 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xphvp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf948998-npbds_openstack-operators(ef2c3378-a1f3-4366-be22-66e12e71dcb2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:26 crc kubenswrapper[4749]: E0128 18:54:26.493784 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" podUID="ef2c3378-a1f3-4366-be22-66e12e71dcb2" Jan 28 18:54:26 crc kubenswrapper[4749]: E0128 18:54:26.776526 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" podUID="ef2c3378-a1f3-4366-be22-66e12e71dcb2" Jan 28 18:54:27 crc kubenswrapper[4749]: I0128 18:54:27.467172 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:54:27 crc kubenswrapper[4749]: I0128 18:54:27.467236 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:54:27 crc kubenswrapper[4749]: I0128 18:54:27.467290 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:54:27 crc kubenswrapper[4749]: I0128 18:54:27.468169 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ffd739e178035a0a80263ddfa883436d23618669a6ffd6b8554f99da5a12189b"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Jan 28 18:54:27 crc kubenswrapper[4749]: I0128 18:54:27.468250 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://ffd739e178035a0a80263ddfa883436d23618669a6ffd6b8554f99da5a12189b" gracePeriod=600 Jan 28 18:54:28 crc kubenswrapper[4749]: E0128 18:54:28.293255 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241" Jan 28 18:54:28 crc kubenswrapper[4749]: E0128 18:54:28.293980 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdvvj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-x28dk_openstack-operators(8780976f-c471-4ba4-956c-33ea9089437f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:28 crc kubenswrapper[4749]: E0128 18:54:28.295930 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled 
desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" podUID="8780976f-c471-4ba4-956c-33ea9089437f" Jan 28 18:54:28 crc kubenswrapper[4749]: I0128 18:54:28.804028 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="ffd739e178035a0a80263ddfa883436d23618669a6ffd6b8554f99da5a12189b" exitCode=0 Jan 28 18:54:28 crc kubenswrapper[4749]: I0128 18:54:28.805319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"ffd739e178035a0a80263ddfa883436d23618669a6ffd6b8554f99da5a12189b"} Jan 28 18:54:28 crc kubenswrapper[4749]: I0128 18:54:28.805377 4749 scope.go:117] "RemoveContainer" containerID="4e5b313040fcc5d2f4a0e0713dba32c28c08f86a27e8cecbbc5d364a34a7eb3e" Jan 28 18:54:28 crc kubenswrapper[4749]: E0128 18:54:28.806319 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" podUID="8780976f-c471-4ba4-956c-33ea9089437f" Jan 28 18:54:29 crc kubenswrapper[4749]: E0128 18:54:29.526142 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/watcher-operator@sha256:35f1eb96f42069bb8f7c33942fb86b41843ba02803464245c16192ccda3d50e4" Jan 28 18:54:29 crc kubenswrapper[4749]: E0128 18:54:29.526682 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/watcher-operator@sha256:35f1eb96f42069bb8f7c33942fb86b41843ba02803464245c16192ccda3d50e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4wbrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-767b8bc766-8kcjp_openstack-operators(cd842919-01b8-4886-914e-2eb9e731f65f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:29 crc kubenswrapper[4749]: E0128 18:54:29.527959 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" podUID="cd842919-01b8-4886-914e-2eb9e731f65f" Jan 28 18:54:29 crc kubenswrapper[4749]: E0128 18:54:29.813707 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/watcher-operator@sha256:35f1eb96f42069bb8f7c33942fb86b41843ba02803464245c16192ccda3d50e4\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" podUID="cd842919-01b8-4886-914e-2eb9e731f65f" Jan 28 18:54:30 crc kubenswrapper[4749]: E0128 18:54:30.104988 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/heat-operator@sha256:429171b44a24e9e4dde46465d90a272d93b15317ea386184d6ad077cc119d3c9" Jan 28 18:54:30 crc kubenswrapper[4749]: E0128 18:54:30.105444 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/heat-operator@sha256:429171b44a24e9e4dde46465d90a272d93b15317ea386184d6ad077cc119d3c9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mhz2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-587c6bfdcf-zbk86_openstack-operators(ce60ace7-ac44-45a5-9422-dade4b147417): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:30 crc kubenswrapper[4749]: E0128 18:54:30.108741 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" podUID="ce60ace7-ac44-45a5-9422-dade4b147417" Jan 28 18:54:30 crc kubenswrapper[4749]: E0128 18:54:30.657453 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/octavia-operator@sha256:c7804813a3bba8910a47a5f32bd528335e18397f93cf5f7e7181d3d2c209b59b" Jan 28 18:54:30 crc kubenswrapper[4749]: E0128 18:54:30.657679 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/octavia-operator@sha256:c7804813a3bba8910a47a5f32bd528335e18397f93cf5f7e7181d3d2c209b59b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dg4v9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5c765b4558-hgs42_openstack-operators(b52dfbde-2120-4054-974f-5992f89c9811): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:30 crc kubenswrapper[4749]: E0128 18:54:30.658882 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" podUID="b52dfbde-2120-4054-974f-5992f89c9811" Jan 28 18:54:30 crc kubenswrapper[4749]: E0128 18:54:30.822689 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/heat-operator@sha256:429171b44a24e9e4dde46465d90a272d93b15317ea386184d6ad077cc119d3c9\\\"\"" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" podUID="ce60ace7-ac44-45a5-9422-dade4b147417" Jan 28 18:54:30 crc kubenswrapper[4749]: E0128 18:54:30.822770 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/octavia-operator@sha256:c7804813a3bba8910a47a5f32bd528335e18397f93cf5f7e7181d3d2c209b59b\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" podUID="b52dfbde-2120-4054-974f-5992f89c9811" Jan 28 18:54:31 crc kubenswrapper[4749]: E0128 18:54:31.225707 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/designate-operator@sha256:29a3092217e72f1ec8a163ed3d15a0a5ccc5b3117e64c72bf5e68597cc233b3d" Jan 28 18:54:31 crc kubenswrapper[4749]: E0128 18:54:31.225911 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/designate-operator@sha256:29a3092217e72f1ec8a163ed3d15a0a5ccc5b3117e64c72bf5e68597cc233b3d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-25tkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66dfbd6f5d-n667g_openstack-operators(b99425c5-bab7-48bb-a341-e1c160eac631): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:31 crc kubenswrapper[4749]: E0128 18:54:31.227383 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" podUID="b99425c5-bab7-48bb-a341-e1c160eac631" Jan 28 18:54:31 crc kubenswrapper[4749]: I0128 18:54:31.278244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:31 crc kubenswrapper[4749]: I0128 18:54:31.288433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76e8a2e5-b9cd-4114-af08-d767fb2ae45d-cert\") pod \"infra-operator-controller-manager-79955696d6-4t7fn\" (UID: \"76e8a2e5-b9cd-4114-af08-d767fb2ae45d\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:31 crc kubenswrapper[4749]: I0128 18:54:31.345301 4749 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:31 crc kubenswrapper[4749]: I0128 18:54:31.585221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:31 crc kubenswrapper[4749]: I0128 18:54:31.590466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b52880-267a-4211-a94d-09d132976cf3-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d6clds\" (UID: \"c5b52880-267a-4211-a94d-09d132976cf3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:31 crc kubenswrapper[4749]: I0128 18:54:31.824546 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:31 crc kubenswrapper[4749]: E0128 18:54:31.832538 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/designate-operator@sha256:29a3092217e72f1ec8a163ed3d15a0a5ccc5b3117e64c72bf5e68597cc233b3d\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" podUID="b99425c5-bab7-48bb-a341-e1c160eac631" Jan 28 18:54:32 crc kubenswrapper[4749]: I0128 18:54:32.096261 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:32 crc kubenswrapper[4749]: I0128 18:54:32.103634 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ec9e008-61f0-4635-8e4c-2ae6fe098fc3-metrics-certs\") pod \"openstack-operator-controller-manager-798d8549d8-9kb8p\" (UID: \"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3\") " pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:32 crc kubenswrapper[4749]: E0128 18:54:32.139099 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/neutron-operator@sha256:22665b40ffeef62d1a612c1f9f0fa8e97ff95085fad123895d786b770f421fc0" Jan 28 18:54:32 crc kubenswrapper[4749]: E0128 18:54:32.139563 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/neutron-operator@sha256:22665b40ffeef62d1a612c1f9f0fa8e97ff95085fad123895d786b770f421fc0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xjktv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-694c5bfc85-cl2lx_openstack-operators(4c6ce987-6e7a-4c86-9269-115e16d51c4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:32 crc kubenswrapper[4749]: E0128 18:54:32.140897 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" podUID="4c6ce987-6e7a-4c86-9269-115e16d51c4d" Jan 28 18:54:32 crc kubenswrapper[4749]: I0128 18:54:32.269890 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:32 crc kubenswrapper[4749]: E0128 18:54:32.841421 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/neutron-operator@sha256:22665b40ffeef62d1a612c1f9f0fa8e97ff95085fad123895d786b770f421fc0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" podUID="4c6ce987-6e7a-4c86-9269-115e16d51c4d" Jan 28 18:54:39 crc kubenswrapper[4749]: E0128 18:54:39.254612 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/cinder-operator@sha256:6da7ec7bf701fe1dd489852a16429f163a69073fae67b872dca4b080cc3514ad" Jan 28 18:54:39 crc kubenswrapper[4749]: E0128 18:54:39.255231 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/cinder-operator@sha256:6da7ec7bf701fe1dd489852a16429f163a69073fae67b872dca4b080cc3514ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kpf75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-f6487bd57-5ztsh_openstack-operators(b553f796-28d7-455e-92bb-05bbc30a9a27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:39 crc kubenswrapper[4749]: E0128 18:54:39.256445 4749 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" podUID="b553f796-28d7-455e-92bb-05bbc30a9a27" Jan 28 18:54:39 crc kubenswrapper[4749]: E0128 18:54:39.763703 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488" Jan 28 18:54:39 crc kubenswrapper[4749]: E0128 18:54:39.764301 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xwgc7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-pqzkn_openstack-operators(a607ca6c-b867-42b6-b9ad-d5941671b685): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:39 crc kubenswrapper[4749]: E0128 18:54:39.765999 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" 
podUID="a607ca6c-b867-42b6-b9ad-d5941671b685" Jan 28 18:54:39 crc kubenswrapper[4749]: E0128 18:54:39.895535 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/cinder-operator@sha256:6da7ec7bf701fe1dd489852a16429f163a69073fae67b872dca4b080cc3514ad\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" podUID="b553f796-28d7-455e-92bb-05bbc30a9a27" Jan 28 18:54:39 crc kubenswrapper[4749]: E0128 18:54:39.896194 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" podUID="a607ca6c-b867-42b6-b9ad-d5941671b685" Jan 28 18:54:41 crc kubenswrapper[4749]: E0128 18:54:41.110690 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/manila-operator@sha256:2e1a77365c3b08ff39892565abfc72b72e969f623e58a2663fb93890371fc9da" Jan 28 18:54:41 crc kubenswrapper[4749]: E0128 18:54:41.111045 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/manila-operator@sha256:2e1a77365c3b08ff39892565abfc72b72e969f623e58a2663fb93890371fc9da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qvhxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-765668569f-p2dr2_openstack-operators(b3ba0ea4-04ca-478e-ae2c-ebcbf3353362): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:41 crc kubenswrapper[4749]: E0128 18:54:41.112507 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" podUID="b3ba0ea4-04ca-478e-ae2c-ebcbf3353362" Jan 28 18:54:41 crc kubenswrapper[4749]: E0128 18:54:41.655697 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/keystone-operator@sha256:45ef0b95f941479535575b3d2cabb58a52e1d8490eed3da1bca9acd49344a722" Jan 28 18:54:41 crc kubenswrapper[4749]: E0128 18:54:41.655948 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/keystone-operator@sha256:45ef0b95f941479535575b3d2cabb58a52e1d8490eed3da1bca9acd49344a722,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bprg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-6978b79747-mgpfr_openstack-operators(d08f8b17-02c1-4000-a5a5-0e53b209472c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:41 crc kubenswrapper[4749]: E0128 18:54:41.657205 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" podUID="d08f8b17-02c1-4000-a5a5-0e53b209472c" Jan 28 18:54:41 crc kubenswrapper[4749]: E0128 18:54:41.924313 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/manila-operator@sha256:2e1a77365c3b08ff39892565abfc72b72e969f623e58a2663fb93890371fc9da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" podUID="b3ba0ea4-04ca-478e-ae2c-ebcbf3353362" Jan 28 18:54:41 crc kubenswrapper[4749]: E0128 18:54:41.924475 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/keystone-operator@sha256:45ef0b95f941479535575b3d2cabb58a52e1d8490eed3da1bca9acd49344a722\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" podUID="d08f8b17-02c1-4000-a5a5-0e53b209472c" Jan 28 18:54:42 crc kubenswrapper[4749]: E0128 18:54:42.140140 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 28 18:54:42 crc kubenswrapper[4749]: E0128 18:54:42.140387 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hlllt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ssdhs_openstack-operators(0f56d027-e393-4583-ab10-8dc8c1046027): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:42 crc kubenswrapper[4749]: E0128 18:54:42.141602 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" podUID="0f56d027-e393-4583-ab10-8dc8c1046027" Jan 28 18:54:42 crc kubenswrapper[4749]: E0128 18:54:42.931966 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" podUID="0f56d027-e393-4583-ab10-8dc8c1046027" Jan 28 18:54:43 crc kubenswrapper[4749]: E0128 18:54:43.143727 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/nova-operator@sha256:a992613466db3478a00c20c28639c4a12f6326aa52c40a418d1ec40038c83b61" Jan 28 18:54:43 crc kubenswrapper[4749]: E0128 18:54:43.144010 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/lmiccini/nova-operator@sha256:a992613466db3478a00c20c28639c4a12f6326aa52c40a418d1ec40038c83b61,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrl4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-ddcbfd695-kq24f_openstack-operators(bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:43 crc kubenswrapper[4749]: E0128 18:54:43.145295 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" podUID="bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b" Jan 28 18:54:43 crc kubenswrapper[4749]: E0128 18:54:43.293189 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:774b657c4a2d169eb939c51d71a146bf4a44e93b" Jan 28 18:54:43 crc kubenswrapper[4749]: E0128 18:54:43.293249 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:774b657c4a2d169eb939c51d71a146bf4a44e93b" Jan 28 18:54:43 crc kubenswrapper[4749]: E0128 18:54:43.293407 4749 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:774b657c4a2d169eb939c51d71a146bf4a44e93b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jghnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-877d65859-6skzm_openstack-operators(87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:54:43 crc kubenswrapper[4749]: E0128 18:54:43.295429 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" podUID="87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c" Jan 28 18:54:43 crc kubenswrapper[4749]: I0128 18:54:43.862786 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p"] Jan 28 18:54:43 crc kubenswrapper[4749]: I0128 18:54:43.947469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"6f6753aa33414e3e6ec1468bb7657379fc59db0884f9f3f2e5c921382fe9d6fb"} Jan 28 18:54:43 crc kubenswrapper[4749]: I0128 18:54:43.949582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" event={"ID":"9749144e-8c20-470f-97f1-c450d9520c07","Type":"ContainerStarted","Data":"8a16fa08dcec9c38c965a3b663c719612579bdd013bde68f7e516cf877dcfb6c"} Jan 28 18:54:43 crc kubenswrapper[4749]: I0128 18:54:43.951309 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" Jan 28 18:54:43 crc kubenswrapper[4749]: I0128 18:54:43.974647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" event={"ID":"c102c3d5-9654-48bf-be4f-6cb41b1f8d7a","Type":"ContainerStarted","Data":"ff3cd74af07bf06fe624b585301c75e554ec90b6fb1259fe7f8b2bc3393bc348"} Jan 28 18:54:43 crc kubenswrapper[4749]: I0128 18:54:43.975478 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" Jan 28 18:54:43 crc kubenswrapper[4749]: I0128 18:54:43.984129 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds"] Jan 28 18:54:43 crc kubenswrapper[4749]: I0128 18:54:43.986846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" event={"ID":"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3","Type":"ContainerStarted","Data":"8f779818ea521e9748760d5a7bc920e6ad2731633101ed14cc58d2327247b9d3"} Jan 28 18:54:44 crc kubenswrapper[4749]: I0128 18:54:44.003108 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn"] Jan 28 18:54:44 crc kubenswrapper[4749]: I0128 18:54:44.012319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" event={"ID":"aeea4bd2-a2a5-451b-9a40-02881d5901a0","Type":"ContainerStarted","Data":"99b3468ac52a32917c7c0dc5d31d4585f295aea4d0d9167c40c1072980caceaf"} Jan 28 18:54:44 crc kubenswrapper[4749]: I0128 18:54:44.013012 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" Jan 28 18:54:44 crc kubenswrapper[4749]: W0128 18:54:44.016446 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5b52880_267a_4211_a94d_09d132976cf3.slice/crio-8a16203a4b8572a949dbe36262ed0dc458ba47bbda1cc7e7346f80f61bd6bb1f WatchSource:0}: Error finding container 8a16203a4b8572a949dbe36262ed0dc458ba47bbda1cc7e7346f80f61bd6bb1f: Status 404 returned error can't find the container with id 8a16203a4b8572a949dbe36262ed0dc458ba47bbda1cc7e7346f80f61bd6bb1f Jan 28 18:54:44 crc kubenswrapper[4749]: I0128 18:54:44.029704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" event={"ID":"66c3a9b0-5edc-468c-bf0e-309cac8be928","Type":"ContainerStarted","Data":"5509c640ecae86c5fb9e1edaaa24a1b51b21638ec42c122180d8291191fd9dff"} Jan 28 18:54:44 crc kubenswrapper[4749]: E0128 18:54:44.032347 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/nova-operator@sha256:a992613466db3478a00c20c28639c4a12f6326aa52c40a418d1ec40038c83b61\\\"\"" 
pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" podUID="bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b" Jan 28 18:54:44 crc kubenswrapper[4749]: I0128 18:54:44.133316 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" podStartSLOduration=7.476050646 podStartE2EDuration="45.13329428s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.187023474 +0000 UTC m=+1110.198550249" lastFinishedPulling="2026-01-28 18:54:39.844267118 +0000 UTC m=+1147.855793883" observedRunningTime="2026-01-28 18:54:44.115002911 +0000 UTC m=+1152.126529706" watchObservedRunningTime="2026-01-28 18:54:44.13329428 +0000 UTC m=+1152.144821065" Jan 28 18:54:44 crc kubenswrapper[4749]: I0128 18:54:44.280044 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" podStartSLOduration=3.127296879 podStartE2EDuration="45.280023799s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:01.164939362 +0000 UTC m=+1109.176466137" lastFinishedPulling="2026-01-28 18:54:43.317666282 +0000 UTC m=+1151.329193057" observedRunningTime="2026-01-28 18:54:44.276050281 +0000 UTC m=+1152.287577056" watchObservedRunningTime="2026-01-28 18:54:44.280023799 +0000 UTC m=+1152.291550574" Jan 28 18:54:44 crc kubenswrapper[4749]: I0128 18:54:44.281319 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" podStartSLOduration=8.75672041 podStartE2EDuration="45.28131197s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:01.080161183 +0000 UTC m=+1109.091687958" lastFinishedPulling="2026-01-28 18:54:37.604752743 +0000 UTC m=+1145.616279518" observedRunningTime="2026-01-28 18:54:44.194902751 +0000 UTC m=+1152.206429556" watchObservedRunningTime="2026-01-28 18:54:44.28131197 +0000 UTC m=+1152.292838745" Jan 28 18:54:44 crc kubenswrapper[4749]: I0128 18:54:44.482336 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" podStartSLOduration=9.989748046 podStartE2EDuration="45.482303871s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.112146287 +0000 UTC m=+1110.123673062" lastFinishedPulling="2026-01-28 18:54:37.604702112 +0000 UTC m=+1145.616228887" observedRunningTime="2026-01-28 18:54:44.450561072 +0000 UTC m=+1152.462087857" watchObservedRunningTime="2026-01-28 18:54:44.482303871 +0000 UTC m=+1152.493830646" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.038408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" event={"ID":"ef2c3378-a1f3-4366-be22-66e12e71dcb2","Type":"ContainerStarted","Data":"61e5f4c8e9851b28c802b486a321aef200b91ee89274d7642a0eb499ebd03355"} Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.039582 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.041272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" 
event={"ID":"8e193a26-a0a4-48e1-a3bf-13b52f809e3e","Type":"ContainerStarted","Data":"44157383c5c72cac1c3e5b1cfb3bb38a1b74fe6fd9910ed2dd8f3627ee8d6300"} Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.041489 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.043009 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" event={"ID":"76e8a2e5-b9cd-4114-af08-d767fb2ae45d","Type":"ContainerStarted","Data":"ebc16a089fdebeb9537c0376edcfc6438d39cb6cf3b115555b40afdbd3e6ae75"} Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.044200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" event={"ID":"c5b52880-267a-4211-a94d-09d132976cf3","Type":"ContainerStarted","Data":"8a16203a4b8572a949dbe36262ed0dc458ba47bbda1cc7e7346f80f61bd6bb1f"} Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.045692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" event={"ID":"b606ed73-e992-4755-ac52-4ace6b8b553c","Type":"ContainerStarted","Data":"21f4fa1949ef3ebd744a3554ff1c15e9c9dfd98eee7d5e2fac87f37f2f53e556"} Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.045857 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.047710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" event={"ID":"8780976f-c471-4ba4-956c-33ea9089437f","Type":"ContainerStarted","Data":"5fcf388dd65464878f37304018d9c57a30efcc06e1b323e9a3a64bd62239fbd7"} Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.048064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.048472 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.064763 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" podStartSLOduration=4.744714635 podStartE2EDuration="46.064744028s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.071684945 +0000 UTC m=+1110.083211720" lastFinishedPulling="2026-01-28 18:54:43.391714338 +0000 UTC m=+1151.403241113" observedRunningTime="2026-01-28 18:54:45.060829852 +0000 UTC m=+1153.072356657" watchObservedRunningTime="2026-01-28 18:54:45.064744028 +0000 UTC m=+1153.076270813" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.088158 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" podStartSLOduration=3.299045163 podStartE2EDuration="46.088135362s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:00.580508226 +0000 UTC m=+1108.592035001" lastFinishedPulling="2026-01-28 18:54:43.369598435 +0000 UTC m=+1151.381125200" 
observedRunningTime="2026-01-28 18:54:45.080429522 +0000 UTC m=+1153.091956327" watchObservedRunningTime="2026-01-28 18:54:45.088135362 +0000 UTC m=+1153.099662137" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.104891 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" podStartSLOduration=5.240502067 podStartE2EDuration="47.104870943s" podCreationTimestamp="2026-01-28 18:53:58 +0000 UTC" firstStartedPulling="2026-01-28 18:54:01.452179558 +0000 UTC m=+1109.463706333" lastFinishedPulling="2026-01-28 18:54:43.316548434 +0000 UTC m=+1151.328075209" observedRunningTime="2026-01-28 18:54:45.097765428 +0000 UTC m=+1153.109292223" watchObservedRunningTime="2026-01-28 18:54:45.104870943 +0000 UTC m=+1153.116397718" Jan 28 18:54:45 crc kubenswrapper[4749]: I0128 18:54:45.898604 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" podStartSLOduration=5.422576583 podStartE2EDuration="46.898586323s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:01.88843563 +0000 UTC m=+1109.899962405" lastFinishedPulling="2026-01-28 18:54:43.36444537 +0000 UTC m=+1151.375972145" observedRunningTime="2026-01-28 18:54:45.128282626 +0000 UTC m=+1153.139809411" watchObservedRunningTime="2026-01-28 18:54:45.898586323 +0000 UTC m=+1153.910113088" Jan 28 18:54:47 crc kubenswrapper[4749]: I0128 18:54:47.062512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" event={"ID":"5ec9e008-61f0-4635-8e4c-2ae6fe098fc3","Type":"ContainerStarted","Data":"ce70c7ea65f946e5784e71e71ddeaccb950667a296c2810e7d235c30c3a35235"} Jan 28 18:54:47 crc kubenswrapper[4749]: I0128 18:54:47.064016 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:47 crc kubenswrapper[4749]: I0128 18:54:47.090298 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" podStartSLOduration=48.090278435 podStartE2EDuration="48.090278435s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:54:47.084520944 +0000 UTC m=+1155.096047729" watchObservedRunningTime="2026-01-28 18:54:47.090278435 +0000 UTC m=+1155.101805210" Jan 28 18:54:49 crc kubenswrapper[4749]: I0128 18:54:49.087290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" event={"ID":"b52dfbde-2120-4054-974f-5992f89c9811","Type":"ContainerStarted","Data":"4b4b20e87ba25c19509a5cde9b188f6e6a1db07d631873ddb3affb66da97e549"} Jan 28 18:54:49 crc kubenswrapper[4749]: I0128 18:54:49.088693 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" Jan 28 18:54:49 crc kubenswrapper[4749]: I0128 18:54:49.132312 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" podStartSLOduration=3.675937927 podStartE2EDuration="50.132289886s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" 
firstStartedPulling="2026-01-28 18:54:02.076020271 +0000 UTC m=+1110.087547046" lastFinishedPulling="2026-01-28 18:54:48.53237223 +0000 UTC m=+1156.543899005" observedRunningTime="2026-01-28 18:54:49.130651326 +0000 UTC m=+1157.142178111" watchObservedRunningTime="2026-01-28 18:54:49.132289886 +0000 UTC m=+1157.143816671" Jan 28 18:54:49 crc kubenswrapper[4749]: I0128 18:54:49.417220 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-q7r57" Jan 28 18:54:49 crc kubenswrapper[4749]: I0128 18:54:49.492429 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-g9cw9" Jan 28 18:54:49 crc kubenswrapper[4749]: I0128 18:54:49.573737 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-958664b5-2x98l" Jan 28 18:54:49 crc kubenswrapper[4749]: I0128 18:54:49.639940 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-7rcdb" Jan 28 18:54:49 crc kubenswrapper[4749]: I0128 18:54:49.901754 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-npbds" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.032839 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.132531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" event={"ID":"b99425c5-bab7-48bb-a341-e1c160eac631","Type":"ContainerStarted","Data":"d10b2bce067b9e6a6a7cc388dad6dec35fe36a817fafa92449bb10a8df1bba35"} Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.132841 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.143549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" event={"ID":"ce60ace7-ac44-45a5-9422-dade4b147417","Type":"ContainerStarted","Data":"b94fccbae665884901451d8e959e7c7c202b0ee6ec65ecdd7c21127f8caca9dd"} Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.144502 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.145490 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x28dk" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.148698 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5szzt" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.162599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" event={"ID":"cd842919-01b8-4886-914e-2eb9e731f65f","Type":"ContainerStarted","Data":"390c43d64b7b4e473397ae0202965026c1ef1669625bfdaeb3145123f63a6da6"} Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 
18:54:50.163692 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.181517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" event={"ID":"4c6ce987-6e7a-4c86-9269-115e16d51c4d","Type":"ContainerStarted","Data":"c8e9df4131b6bb3ef6fbdb3e3e029aac2f122c2acc92425c837d0bcfc86d69fe"} Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.182543 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.186013 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" podStartSLOduration=2.604499635 podStartE2EDuration="51.185990724s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:00.591606388 +0000 UTC m=+1108.603133163" lastFinishedPulling="2026-01-28 18:54:49.173097477 +0000 UTC m=+1157.184624252" observedRunningTime="2026-01-28 18:54:50.178126261 +0000 UTC m=+1158.189653046" watchObservedRunningTime="2026-01-28 18:54:50.185990724 +0000 UTC m=+1158.197517499" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.248678 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" podStartSLOduration=4.883904278 podStartE2EDuration="51.24865288s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.168289944 +0000 UTC m=+1110.179816719" lastFinishedPulling="2026-01-28 18:54:48.533038546 +0000 UTC m=+1156.544565321" observedRunningTime="2026-01-28 18:54:50.210049324 +0000 UTC m=+1158.221576109" watchObservedRunningTime="2026-01-28 18:54:50.24865288 +0000 UTC m=+1158.260179675" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.289454 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" podStartSLOduration=3.404438167 podStartE2EDuration="51.289437661s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:01.289136459 +0000 UTC m=+1109.300663234" lastFinishedPulling="2026-01-28 18:54:49.174135963 +0000 UTC m=+1157.185662728" observedRunningTime="2026-01-28 18:54:50.287032762 +0000 UTC m=+1158.298559537" watchObservedRunningTime="2026-01-28 18:54:50.289437661 +0000 UTC m=+1158.300964436" Jan 28 18:54:50 crc kubenswrapper[4749]: I0128 18:54:50.426291 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" podStartSLOduration=5.371376255 podStartE2EDuration="51.426269947s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.545762223 +0000 UTC m=+1110.557288998" lastFinishedPulling="2026-01-28 18:54:48.600655915 +0000 UTC m=+1156.612182690" observedRunningTime="2026-01-28 18:54:50.378180728 +0000 UTC m=+1158.389707523" watchObservedRunningTime="2026-01-28 18:54:50.426269947 +0000 UTC m=+1158.437796722" Jan 28 18:54:52 crc kubenswrapper[4749]: I0128 18:54:52.282854 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-manager-798d8549d8-9kb8p" Jan 28 18:54:54 crc kubenswrapper[4749]: I0128 18:54:54.219511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" event={"ID":"76e8a2e5-b9cd-4114-af08-d767fb2ae45d","Type":"ContainerStarted","Data":"98c37106d3f336abcf2a42ede652f5e3d4c160aa41fcb9b2cd055502d513a436"} Jan 28 18:54:54 crc kubenswrapper[4749]: I0128 18:54:54.220002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:54:54 crc kubenswrapper[4749]: I0128 18:54:54.220769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" event={"ID":"c5b52880-267a-4211-a94d-09d132976cf3","Type":"ContainerStarted","Data":"8102b71ca3507317fbaf93da776a68d1c928e8d61c20ce3f8b0e48f32ae5687e"} Jan 28 18:54:54 crc kubenswrapper[4749]: I0128 18:54:54.221266 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:54:54 crc kubenswrapper[4749]: I0128 18:54:54.234043 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" podStartSLOduration=45.306719594 podStartE2EDuration="55.234026123s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:44.0118263 +0000 UTC m=+1152.023353075" lastFinishedPulling="2026-01-28 18:54:53.939132829 +0000 UTC m=+1161.950659604" observedRunningTime="2026-01-28 18:54:54.231561603 +0000 UTC m=+1162.243088378" watchObservedRunningTime="2026-01-28 18:54:54.234026123 +0000 UTC m=+1162.245552898" Jan 28 18:54:54 crc kubenswrapper[4749]: I0128 18:54:54.254359 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" podStartSLOduration=45.325804652 podStartE2EDuration="55.254316761s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:44.027470763 +0000 UTC m=+1152.038997538" lastFinishedPulling="2026-01-28 18:54:53.955982872 +0000 UTC m=+1161.967509647" observedRunningTime="2026-01-28 18:54:54.252739752 +0000 UTC m=+1162.264266537" watchObservedRunningTime="2026-01-28 18:54:54.254316761 +0000 UTC m=+1162.265843566" Jan 28 18:54:55 crc kubenswrapper[4749]: I0128 18:54:55.229933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" event={"ID":"b553f796-28d7-455e-92bb-05bbc30a9a27","Type":"ContainerStarted","Data":"55a79312163399f4426277c9f35f649ceae9d9b1e55eac2b5902a7fa1910da25"} Jan 28 18:54:55 crc kubenswrapper[4749]: I0128 18:54:55.230189 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" Jan 28 18:54:55 crc kubenswrapper[4749]: I0128 18:54:55.231800 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" event={"ID":"b3ba0ea4-04ca-478e-ae2c-ebcbf3353362","Type":"ContainerStarted","Data":"2a6e66f6fd27b43bf7360e74416253e7e36fa8baf24f0ec4ba0c0aee1a19300b"} Jan 28 18:54:55 crc kubenswrapper[4749]: I0128 18:54:55.232369 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" Jan 28 18:54:55 crc kubenswrapper[4749]: I0128 18:54:55.250916 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" podStartSLOduration=4.095964301 podStartE2EDuration="56.250901257s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.168039169 +0000 UTC m=+1110.179565944" lastFinishedPulling="2026-01-28 18:54:54.322976115 +0000 UTC m=+1162.334502900" observedRunningTime="2026-01-28 18:54:55.250051147 +0000 UTC m=+1163.261577922" watchObservedRunningTime="2026-01-28 18:54:55.250901257 +0000 UTC m=+1163.262428032" Jan 28 18:54:55 crc kubenswrapper[4749]: I0128 18:54:55.281489 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" podStartSLOduration=4.370056113 podStartE2EDuration="56.281471587s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.567575589 +0000 UTC m=+1110.579102364" lastFinishedPulling="2026-01-28 18:54:54.478991063 +0000 UTC m=+1162.490517838" observedRunningTime="2026-01-28 18:54:55.273331348 +0000 UTC m=+1163.284858133" watchObservedRunningTime="2026-01-28 18:54:55.281471587 +0000 UTC m=+1163.292998362" Jan 28 18:54:56 crc kubenswrapper[4749]: I0128 18:54:56.240474 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" event={"ID":"a607ca6c-b867-42b6-b9ad-d5941671b685","Type":"ContainerStarted","Data":"8757d31da897d3b1af52fda561fd056309b007aa365ffbac8e8fa36720c8749b"} Jan 28 18:54:56 crc kubenswrapper[4749]: I0128 18:54:56.240967 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" Jan 28 18:54:56 crc kubenswrapper[4749]: I0128 18:54:56.254397 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" podStartSLOduration=4.280135709 podStartE2EDuration="57.254377243s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.326752131 +0000 UTC m=+1110.338278907" lastFinishedPulling="2026-01-28 18:54:55.300993666 +0000 UTC m=+1163.312520441" observedRunningTime="2026-01-28 18:54:56.253976404 +0000 UTC m=+1164.265503169" watchObservedRunningTime="2026-01-28 18:54:56.254377243 +0000 UTC m=+1164.265904018" Jan 28 18:54:57 crc kubenswrapper[4749]: E0128 18:54:57.719308 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:774b657c4a2d169eb939c51d71a146bf4a44e93b\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" podUID="87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c" Jan 28 18:54:58 crc kubenswrapper[4749]: I0128 18:54:58.760960 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" event={"ID":"d08f8b17-02c1-4000-a5a5-0e53b209472c","Type":"ContainerStarted","Data":"3a416918b83001f432b51eb49fca39f65859842b72e4077f84541b6889f182ab"} Jan 28 18:54:58 crc kubenswrapper[4749]: I0128 18:54:58.761832 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" Jan 28 18:54:58 crc kubenswrapper[4749]: I0128 18:54:58.763589 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" event={"ID":"bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b","Type":"ContainerStarted","Data":"7e0a0c15cb1423767de661c3f91c83d57f1e9e14ca3c8f15f182fb9e03cc2e51"} Jan 28 18:54:58 crc kubenswrapper[4749]: I0128 18:54:58.763900 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" Jan 28 18:54:58 crc kubenswrapper[4749]: I0128 18:54:58.765885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" event={"ID":"0f56d027-e393-4583-ab10-8dc8c1046027","Type":"ContainerStarted","Data":"570328f7697639fdfc7f61a7e975e3249461846286ae14e964b9ed627a558644"} Jan 28 18:54:58 crc kubenswrapper[4749]: I0128 18:54:58.787450 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" podStartSLOduration=3.935517575 podStartE2EDuration="59.78742973s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.338813288 +0000 UTC m=+1110.350340063" lastFinishedPulling="2026-01-28 18:54:58.190725433 +0000 UTC m=+1166.202252218" observedRunningTime="2026-01-28 18:54:58.781786891 +0000 UTC m=+1166.793313676" watchObservedRunningTime="2026-01-28 18:54:58.78742973 +0000 UTC m=+1166.798956495" Jan 28 18:54:58 crc kubenswrapper[4749]: I0128 18:54:58.821548 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ssdhs" podStartSLOduration=4.267363504 podStartE2EDuration="59.821525266s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.343987424 +0000 UTC m=+1110.355514199" lastFinishedPulling="2026-01-28 18:54:57.898149176 +0000 UTC m=+1165.909675961" observedRunningTime="2026-01-28 18:54:58.818150183 +0000 UTC m=+1166.829676968" watchObservedRunningTime="2026-01-28 18:54:58.821525266 +0000 UTC m=+1166.833052041" Jan 28 18:54:58 crc kubenswrapper[4749]: I0128 18:54:58.823855 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" podStartSLOduration=4.265276464 podStartE2EDuration="59.823846213s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.3380989 +0000 UTC m=+1110.349625675" lastFinishedPulling="2026-01-28 18:54:57.896668649 +0000 UTC m=+1165.908195424" observedRunningTime="2026-01-28 18:54:58.805555145 +0000 UTC m=+1166.817081920" watchObservedRunningTime="2026-01-28 18:54:58.823846213 +0000 UTC m=+1166.835372988" Jan 28 18:54:59 crc kubenswrapper[4749]: I0128 18:54:59.415498 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-n667g" Jan 28 18:54:59 crc kubenswrapper[4749]: I0128 18:54:59.505252 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-zbk86" Jan 28 18:54:59 crc kubenswrapper[4749]: I0128 18:54:59.673857 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-5ztsh" Jan 28 18:54:59 crc kubenswrapper[4749]: I0128 18:54:59.894235 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-765668569f-p2dr2" Jan 28 18:54:59 crc kubenswrapper[4749]: I0128 18:54:59.940140 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-cl2lx" Jan 28 18:54:59 crc kubenswrapper[4749]: I0128 18:54:59.992032 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-hgs42" Jan 28 18:55:00 crc kubenswrapper[4749]: I0128 18:55:00.068643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pqzkn" Jan 28 18:55:00 crc kubenswrapper[4749]: I0128 18:55:00.454656 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-8kcjp" Jan 28 18:55:01 crc kubenswrapper[4749]: I0128 18:55:01.351395 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4t7fn" Jan 28 18:55:01 crc kubenswrapper[4749]: I0128 18:55:01.832549 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d6clds" Jan 28 18:55:09 crc kubenswrapper[4749]: I0128 18:55:09.763675 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-mgpfr" Jan 28 18:55:09 crc kubenswrapper[4749]: I0128 18:55:09.956886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-kq24f" Jan 28 18:55:11 crc kubenswrapper[4749]: I0128 18:55:11.868088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" event={"ID":"87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c","Type":"ContainerStarted","Data":"431e7ce74c5c3d892b11edaadc6dabeabe66615e8bed3727f48ae8263ffb3aa7"} Jan 28 18:55:11 crc kubenswrapper[4749]: I0128 18:55:11.868620 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" Jan 28 18:55:11 crc kubenswrapper[4749]: I0128 18:55:11.887028 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" podStartSLOduration=4.510367941 podStartE2EDuration="1m12.887009156s" podCreationTimestamp="2026-01-28 18:53:59 +0000 UTC" firstStartedPulling="2026-01-28 18:54:02.570120342 +0000 UTC m=+1110.581647117" lastFinishedPulling="2026-01-28 18:55:10.946761547 +0000 UTC m=+1178.958288332" observedRunningTime="2026-01-28 18:55:11.880649518 +0000 UTC m=+1179.892176303" watchObservedRunningTime="2026-01-28 18:55:11.887009156 +0000 UTC m=+1179.898535931" Jan 28 18:55:20 crc kubenswrapper[4749]: I0128 18:55:20.097637 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-877d65859-6skzm" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.233029 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qzkpr"] Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.235146 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.240957 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-j2p99" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.241373 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.241675 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.241932 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.244029 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qzkpr"] Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.293949 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6kmfq"] Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.295561 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.300275 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.305055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6kmfq"] Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.345578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699d2959-eac0-4edd-a8c7-cd893f426b40-config\") pod \"dnsmasq-dns-675f4bcbfc-qzkpr\" (UID: \"699d2959-eac0-4edd-a8c7-cd893f426b40\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.345673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49kqf\" (UniqueName: \"kubernetes.io/projected/699d2959-eac0-4edd-a8c7-cd893f426b40-kube-api-access-49kqf\") pod \"dnsmasq-dns-675f4bcbfc-qzkpr\" (UID: \"699d2959-eac0-4edd-a8c7-cd893f426b40\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.447109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699d2959-eac0-4edd-a8c7-cd893f426b40-config\") pod \"dnsmasq-dns-675f4bcbfc-qzkpr\" (UID: \"699d2959-eac0-4edd-a8c7-cd893f426b40\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.447179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qjnk\" (UniqueName: \"kubernetes.io/projected/87ffdafe-209d-4c00-a039-5353a513db9d-kube-api-access-8qjnk\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.447239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-config\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.447263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49kqf\" (UniqueName: \"kubernetes.io/projected/699d2959-eac0-4edd-a8c7-cd893f426b40-kube-api-access-49kqf\") pod \"dnsmasq-dns-675f4bcbfc-qzkpr\" (UID: \"699d2959-eac0-4edd-a8c7-cd893f426b40\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.447283 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.448228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699d2959-eac0-4edd-a8c7-cd893f426b40-config\") pod \"dnsmasq-dns-675f4bcbfc-qzkpr\" (UID: \"699d2959-eac0-4edd-a8c7-cd893f426b40\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.467622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49kqf\" (UniqueName: \"kubernetes.io/projected/699d2959-eac0-4edd-a8c7-cd893f426b40-kube-api-access-49kqf\") pod \"dnsmasq-dns-675f4bcbfc-qzkpr\" (UID: \"699d2959-eac0-4edd-a8c7-cd893f426b40\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.548622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qjnk\" (UniqueName: \"kubernetes.io/projected/87ffdafe-209d-4c00-a039-5353a513db9d-kube-api-access-8qjnk\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.548688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-config\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.548720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.549538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.549617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-config\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: 
\"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.560013 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.578238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qjnk\" (UniqueName: \"kubernetes.io/projected/87ffdafe-209d-4c00-a039-5353a513db9d-kube-api-access-8qjnk\") pod \"dnsmasq-dns-78dd6ddcc-6kmfq\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:35 crc kubenswrapper[4749]: I0128 18:55:35.618823 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:36 crc kubenswrapper[4749]: I0128 18:55:36.005285 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qzkpr"] Jan 28 18:55:36 crc kubenswrapper[4749]: I0128 18:55:36.051706 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" event={"ID":"699d2959-eac0-4edd-a8c7-cd893f426b40","Type":"ContainerStarted","Data":"1f794555b62c19e4cbecf8505903cc535a968e66dbab2a9fddad461c03f6c96a"} Jan 28 18:55:36 crc kubenswrapper[4749]: I0128 18:55:36.141013 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6kmfq"] Jan 28 18:55:36 crc kubenswrapper[4749]: W0128 18:55:36.144175 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87ffdafe_209d_4c00_a039_5353a513db9d.slice/crio-31b3abd2151d00e4aaa12223afddbc6b23b31b76a8508d98eef71eab32b5e13d WatchSource:0}: Error finding container 31b3abd2151d00e4aaa12223afddbc6b23b31b76a8508d98eef71eab32b5e13d: Status 404 returned error can't find the container with id 31b3abd2151d00e4aaa12223afddbc6b23b31b76a8508d98eef71eab32b5e13d Jan 28 18:55:37 crc kubenswrapper[4749]: I0128 18:55:37.065210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" event={"ID":"87ffdafe-209d-4c00-a039-5353a513db9d","Type":"ContainerStarted","Data":"31b3abd2151d00e4aaa12223afddbc6b23b31b76a8508d98eef71eab32b5e13d"} Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.024494 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qzkpr"] Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.064994 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mn7sc"] Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.066889 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.078673 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mn7sc"] Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.208805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-config\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.208861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7npc7\" (UniqueName: \"kubernetes.io/projected/4d1004cb-d7e0-4f3a-9739-33d4afdec629-kube-api-access-7npc7\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.208938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.311728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-config\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.311898 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7npc7\" (UniqueName: \"kubernetes.io/projected/4d1004cb-d7e0-4f3a-9739-33d4afdec629-kube-api-access-7npc7\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.311988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.313000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-config\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.314468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.366080 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7npc7\" (UniqueName: 
\"kubernetes.io/projected/4d1004cb-d7e0-4f3a-9739-33d4afdec629-kube-api-access-7npc7\") pod \"dnsmasq-dns-666b6646f7-mn7sc\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.401622 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.450792 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6kmfq"] Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.488369 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wgmbm"] Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.489731 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.532939 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wgmbm"] Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.621511 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld5zh\" (UniqueName: \"kubernetes.io/projected/8b3ee51e-214c-4efe-875b-51db3beea94c-kube-api-access-ld5zh\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.621679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.621802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-config\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.724282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-config\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.726004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld5zh\" (UniqueName: \"kubernetes.io/projected/8b3ee51e-214c-4efe-875b-51db3beea94c-kube-api-access-ld5zh\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.726087 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.725770 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-config\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.728703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.747144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld5zh\" (UniqueName: \"kubernetes.io/projected/8b3ee51e-214c-4efe-875b-51db3beea94c-kube-api-access-ld5zh\") pod \"dnsmasq-dns-57d769cc4f-wgmbm\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:38 crc kubenswrapper[4749]: I0128 18:55:38.862004 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.230055 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.232976 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.235272 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xj4k4" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.237512 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.237733 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.237857 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.237951 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.238094 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.238274 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.241337 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.254839 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.256526 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.271398 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.273298 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.279998 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.296512 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.336825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-config-data\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.336873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.336899 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.336915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337040 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1842f17a-bc44-4eec-81bf-fc94b63362c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1842f17a-bc44-4eec-81bf-fc94b63362c9\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337113 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19a12543-c0a3-486f-b5bd-4f2862c15a37-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337207 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: 
I0128 18:55:39.337267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-server-conf\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqcv\" (UniqueName: \"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-kube-api-access-xjqcv\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-config-data\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94850daa-65af-4e6a-ad29-cfa28c3076e7-pod-info\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19a12543-c0a3-486f-b5bd-4f2862c15a37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337581 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9xd\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-kube-api-access-ll9xd\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fa673e89-5fa0-4b37-809b-384a7c067e9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa673e89-5fa0-4b37-809b-384a7c067e9c\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.337637 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94850daa-65af-4e6a-ad29-cfa28c3076e7-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19a12543-c0a3-486f-b5bd-4f2862c15a37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439296 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9xd\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-kube-api-access-ll9xd\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439314 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fa673e89-5fa0-4b37-809b-384a7c067e9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa673e89-5fa0-4b37-809b-384a7c067e9c\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94850daa-65af-4e6a-ad29-cfa28c3076e7-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-config-data\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439799 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdhg\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-kube-api-access-5rdhg\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: 
\"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.439980 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.440009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-server-conf\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.440036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1842f17a-bc44-4eec-81bf-fc94b63362c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1842f17a-bc44-4eec-81bf-fc94b63362c9\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.440085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-config-data\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.440793 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441463 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f7614ab4-92b2-4909-9f4f-e9608bc0d74e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7614ab4-92b2-4909-9f4f-e9608bc0d74e\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " 
pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/772600b2-9086-4d72-bb86-6edfb0a21b35-pod-info\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19a12543-c0a3-486f-b5bd-4f2862c15a37-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-server-conf\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqcv\" (UniqueName: \"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-kube-api-access-xjqcv\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-config-data\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.441806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.443086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.443130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.443161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94850daa-65af-4e6a-ad29-cfa28c3076e7-pod-info\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.443181 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/772600b2-9086-4d72-bb86-6edfb0a21b35-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.443764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.444094 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.444122 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1842f17a-bc44-4eec-81bf-fc94b63362c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1842f17a-bc44-4eec-81bf-fc94b63362c9\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c48bcdd45bb4b8e8cd135d1d3cfd19b3a72b6b291775ad971e53118279ae65bc/globalmount\"" pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.444241 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.444264 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fa673e89-5fa0-4b37-809b-384a7c067e9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa673e89-5fa0-4b37-809b-384a7c067e9c\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4035aa00586cd4726ab4e7b65841275877986337193c92c62a02a89dfc4b09d6/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.444499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.445137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.445419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-config-data\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.446085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.446904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.447493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.448081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a12543-c0a3-486f-b5bd-4f2862c15a37-config-data\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.448183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.448460 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94850daa-65af-4e6a-ad29-cfa28c3076e7-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.448536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94850daa-65af-4e6a-ad29-cfa28c3076e7-server-conf\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.450789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19a12543-c0a3-486f-b5bd-4f2862c15a37-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.450980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94850daa-65af-4e6a-ad29-cfa28c3076e7-pod-info\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.451385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19a12543-c0a3-486f-b5bd-4f2862c15a37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.456035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.458813 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjqcv\" (UniqueName: \"kubernetes.io/projected/19a12543-c0a3-486f-b5bd-4f2862c15a37-kube-api-access-xjqcv\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.459135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll9xd\" (UniqueName: \"kubernetes.io/projected/94850daa-65af-4e6a-ad29-cfa28c3076e7-kube-api-access-ll9xd\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.483811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1842f17a-bc44-4eec-81bf-fc94b63362c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1842f17a-bc44-4eec-81bf-fc94b63362c9\") pod \"rabbitmq-server-1\" (UID: \"94850daa-65af-4e6a-ad29-cfa28c3076e7\") " pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.485611 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fa673e89-5fa0-4b37-809b-384a7c067e9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa673e89-5fa0-4b37-809b-384a7c067e9c\") pod \"rabbitmq-server-0\" (UID: \"19a12543-c0a3-486f-b5bd-4f2862c15a37\") " 
pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553369 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/772600b2-9086-4d72-bb86-6edfb0a21b35-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdhg\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-kube-api-access-5rdhg\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-server-conf\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-config-data\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553628 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f7614ab4-92b2-4909-9f4f-e9608bc0d74e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7614ab4-92b2-4909-9f4f-e9608bc0d74e\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553647 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/772600b2-9086-4d72-bb86-6edfb0a21b35-pod-info\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.553681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.554154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.557889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.563218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-config-data\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.564449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.569922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/772600b2-9086-4d72-bb86-6edfb0a21b35-server-conf\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.570934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.575110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/772600b2-9086-4d72-bb86-6edfb0a21b35-pod-info\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.576493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/772600b2-9086-4d72-bb86-6edfb0a21b35-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.582555 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.583506 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.583562 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f7614ab4-92b2-4909-9f4f-e9608bc0d74e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7614ab4-92b2-4909-9f4f-e9608bc0d74e\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6ab93d2962f7a278a5d3ef3f17c19e017c8c19d475c07dc87711def1c710f3ac/globalmount\"" pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.586204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rdhg\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-kube-api-access-5rdhg\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.598351 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.602460 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/772600b2-9086-4d72-bb86-6edfb0a21b35-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.677133 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.682402 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.687679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.690026 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.690287 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.690356 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.690538 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.690584 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-csm5j" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.690715 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.691174 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.707845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f7614ab4-92b2-4909-9f4f-e9608bc0d74e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7614ab4-92b2-4909-9f4f-e9608bc0d74e\") pod \"rabbitmq-server-2\" (UID: \"772600b2-9086-4d72-bb86-6edfb0a21b35\") " pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757035 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5954ab85-e42a-498a-ae91-fd46445c0860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5954ab85-e42a-498a-ae91-fd46445c0860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757223 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkscm\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-kube-api-access-nkscm\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757306 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.757349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d5ab9b04-c6f9-4ab4-bb7e-fc50ce782511\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5ab9b04-c6f9-4ab4-bb7e-fc50ce782511\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5954ab85-e42a-498a-ae91-fd46445c0860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc 
kubenswrapper[4749]: I0128 18:55:39.861419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5954ab85-e42a-498a-ae91-fd46445c0860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkscm\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-kube-api-access-nkscm\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d5ab9b04-c6f9-4ab4-bb7e-fc50ce782511\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5ab9b04-c6f9-4ab4-bb7e-fc50ce782511\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.861609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.862364 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.862886 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.863626 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.865816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.866296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.866628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5954ab85-e42a-498a-ae91-fd46445c0860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.867939 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.868114 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d5ab9b04-c6f9-4ab4-bb7e-fc50ce782511\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5ab9b04-c6f9-4ab4-bb7e-fc50ce782511\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3649ec8ca6cfb1616a7f6958f49a5941ffe47bc4688f083252bca9290071ac97/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.870796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5954ab85-e42a-498a-ae91-fd46445c0860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.871530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.872850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5954ab85-e42a-498a-ae91-fd46445c0860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.882180 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkscm\" (UniqueName: \"kubernetes.io/projected/5954ab85-e42a-498a-ae91-fd46445c0860-kube-api-access-nkscm\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.905396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 28 18:55:39 crc kubenswrapper[4749]: I0128 18:55:39.920100 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d5ab9b04-c6f9-4ab4-bb7e-fc50ce782511\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5ab9b04-c6f9-4ab4-bb7e-fc50ce782511\") pod \"rabbitmq-cell1-server-0\" (UID: \"5954ab85-e42a-498a-ae91-fd46445c0860\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.022491 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.860774 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.866429 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.869508 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.871921 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.875510 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-d24ft" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.876272 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.876450 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.882489 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.887519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c01d945-dccd-468e-b6ce-d269f1715462-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.887573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.887612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-kolla-config\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.887632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9t72\" (UniqueName: \"kubernetes.io/projected/6c01d945-dccd-468e-b6ce-d269f1715462-kube-api-access-s9t72\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.887667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c01d945-dccd-468e-b6ce-d269f1715462-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.887717 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c01d945-dccd-468e-b6ce-d269f1715462-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.887748 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-config-data-default\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.887774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29042cee-3b77-419f-a9fc-31649487c2e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29042cee-3b77-419f-a9fc-31649487c2e6\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.990451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.990521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-kolla-config\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.990547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9t72\" (UniqueName: \"kubernetes.io/projected/6c01d945-dccd-468e-b6ce-d269f1715462-kube-api-access-s9t72\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.990602 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c01d945-dccd-468e-b6ce-d269f1715462-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.990681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c01d945-dccd-468e-b6ce-d269f1715462-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.990723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-config-data-default\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.990751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-29042cee-3b77-419f-a9fc-31649487c2e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29042cee-3b77-419f-a9fc-31649487c2e6\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.990794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c01d945-dccd-468e-b6ce-d269f1715462-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.991169 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c01d945-dccd-468e-b6ce-d269f1715462-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.992435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:40 crc kubenswrapper[4749]: I0128 18:55:40.992823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-kolla-config\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:41 crc kubenswrapper[4749]: I0128 18:55:41.003756 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:55:41 crc kubenswrapper[4749]: I0128 18:55:41.003793 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29042cee-3b77-419f-a9fc-31649487c2e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29042cee-3b77-419f-a9fc-31649487c2e6\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a30446a02679099beb5e3f71e1c1bc69d6faa1145c3ac519e4640884eb297b73/globalmount\"" pod="openstack/openstack-galera-0" Jan 28 18:55:41 crc kubenswrapper[4749]: I0128 18:55:41.003909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c01d945-dccd-468e-b6ce-d269f1715462-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:41 crc kubenswrapper[4749]: I0128 18:55:41.006823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c01d945-dccd-468e-b6ce-d269f1715462-config-data-default\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:41 crc kubenswrapper[4749]: I0128 18:55:41.013062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c01d945-dccd-468e-b6ce-d269f1715462-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:41 crc kubenswrapper[4749]: I0128 18:55:41.013600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9t72\" (UniqueName: \"kubernetes.io/projected/6c01d945-dccd-468e-b6ce-d269f1715462-kube-api-access-s9t72\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " 
pod="openstack/openstack-galera-0" Jan 28 18:55:41 crc kubenswrapper[4749]: I0128 18:55:41.049026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-29042cee-3b77-419f-a9fc-31649487c2e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29042cee-3b77-419f-a9fc-31649487c2e6\") pod \"openstack-galera-0\" (UID: \"6c01d945-dccd-468e-b6ce-d269f1715462\") " pod="openstack/openstack-galera-0" Jan 28 18:55:41 crc kubenswrapper[4749]: I0128 18:55:41.203784 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.118177 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.119955 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.124009 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-klw6b" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.124344 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.124378 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.131944 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.132109 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.166242 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.167963 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.171752 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.171954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.174695 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nrct6" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.205547 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.318512 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t85t\" (UniqueName: \"kubernetes.io/projected/1c85419b-39a8-4c29-bff0-475dfc988a32-kube-api-access-2t85t\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.318658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.318740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c85419b-39a8-4c29-bff0-475dfc988a32-kolla-config\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.318816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.318869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtml6\" (UniqueName: \"kubernetes.io/projected/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-kube-api-access-qtml6\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.318918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.319030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c85419b-39a8-4c29-bff0-475dfc988a32-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.319145 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.319186 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c85419b-39a8-4c29-bff0-475dfc988a32-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.319234 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.319250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.319318 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c85419b-39a8-4c29-bff0-475dfc988a32-config-data\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.319395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-72862411-6079-4738-9f62-655cf5653301\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72862411-6079-4738-9f62-655cf5653301\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.422143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.422202 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c85419b-39a8-4c29-bff0-475dfc988a32-kolla-config\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.422249 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.422279 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qtml6\" (UniqueName: \"kubernetes.io/projected/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-kube-api-access-qtml6\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.422307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.423085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.423200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c85419b-39a8-4c29-bff0-475dfc988a32-kolla-config\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.423261 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c85419b-39a8-4c29-bff0-475dfc988a32-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.424050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.424145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.424180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c85419b-39a8-4c29-bff0-475dfc988a32-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.424221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.424235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.424291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c85419b-39a8-4c29-bff0-475dfc988a32-config-data\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.424314 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-72862411-6079-4738-9f62-655cf5653301\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72862411-6079-4738-9f62-655cf5653301\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.424376 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t85t\" (UniqueName: \"kubernetes.io/projected/1c85419b-39a8-4c29-bff0-475dfc988a32-kube-api-access-2t85t\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.425365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.426176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c85419b-39a8-4c29-bff0-475dfc988a32-config-data\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.426446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.432044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c85419b-39a8-4c29-bff0-475dfc988a32-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.435903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c85419b-39a8-4c29-bff0-475dfc988a32-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.437510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.437640 4749 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.437670 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-72862411-6079-4738-9f62-655cf5653301\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72862411-6079-4738-9f62-655cf5653301\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8b50f5717b12e8917732784447cd088241caa293ed65e884f1eab2c0d79d0bf3/globalmount\"" pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.438123 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.439382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtml6\" (UniqueName: \"kubernetes.io/projected/7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e-kube-api-access-qtml6\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.442222 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t85t\" (UniqueName: \"kubernetes.io/projected/1c85419b-39a8-4c29-bff0-475dfc988a32-kube-api-access-2t85t\") pod \"memcached-0\" (UID: \"1c85419b-39a8-4c29-bff0-475dfc988a32\") " pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.475851 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-72862411-6079-4738-9f62-655cf5653301\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72862411-6079-4738-9f62-655cf5653301\") pod \"openstack-cell1-galera-0\" (UID: \"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e\") " pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.503246 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 28 18:55:42 crc kubenswrapper[4749]: I0128 18:55:42.742050 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 28 18:55:44 crc kubenswrapper[4749]: I0128 18:55:44.169507 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 18:55:44 crc kubenswrapper[4749]: I0128 18:55:44.171064 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 18:55:44 crc kubenswrapper[4749]: I0128 18:55:44.172973 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-42c9p" Jan 28 18:55:44 crc kubenswrapper[4749]: I0128 18:55:44.185045 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 18:55:44 crc kubenswrapper[4749]: I0128 18:55:44.365150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj2bx\" (UniqueName: \"kubernetes.io/projected/c2a507d2-d263-4695-96e3-4f7af4761450-kube-api-access-jj2bx\") pod \"kube-state-metrics-0\" (UID: \"c2a507d2-d263-4695-96e3-4f7af4761450\") " pod="openstack/kube-state-metrics-0" Jan 28 18:55:44 crc kubenswrapper[4749]: I0128 18:55:44.466742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj2bx\" (UniqueName: \"kubernetes.io/projected/c2a507d2-d263-4695-96e3-4f7af4761450-kube-api-access-jj2bx\") pod \"kube-state-metrics-0\" (UID: \"c2a507d2-d263-4695-96e3-4f7af4761450\") " pod="openstack/kube-state-metrics-0" Jan 28 18:55:44 crc kubenswrapper[4749]: I0128 18:55:44.530450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj2bx\" (UniqueName: \"kubernetes.io/projected/c2a507d2-d263-4695-96e3-4f7af4761450-kube-api-access-jj2bx\") pod \"kube-state-metrics-0\" (UID: \"c2a507d2-d263-4695-96e3-4f7af4761450\") " pod="openstack/kube-state-metrics-0" Jan 28 18:55:44 crc kubenswrapper[4749]: I0128 18:55:44.799271 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.319748 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z"] Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.321022 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.323978 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-45v9k" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.324472 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.336435 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z"] Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.485880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sf7\" (UniqueName: \"kubernetes.io/projected/5d090067-e17c-4708-8d06-23dd0e9da1dc-kube-api-access-57sf7\") pod \"observability-ui-dashboards-66cbf594b5-9lj2z\" (UID: \"5d090067-e17c-4708-8d06-23dd0e9da1dc\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.486209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d090067-e17c-4708-8d06-23dd0e9da1dc-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-9lj2z\" (UID: \"5d090067-e17c-4708-8d06-23dd0e9da1dc\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.588168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57sf7\" (UniqueName: \"kubernetes.io/projected/5d090067-e17c-4708-8d06-23dd0e9da1dc-kube-api-access-57sf7\") pod \"observability-ui-dashboards-66cbf594b5-9lj2z\" (UID: \"5d090067-e17c-4708-8d06-23dd0e9da1dc\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.588244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d090067-e17c-4708-8d06-23dd0e9da1dc-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-9lj2z\" (UID: \"5d090067-e17c-4708-8d06-23dd0e9da1dc\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:45 crc kubenswrapper[4749]: E0128 18:55:45.588791 4749 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Jan 28 18:55:45 crc kubenswrapper[4749]: E0128 18:55:45.588895 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d090067-e17c-4708-8d06-23dd0e9da1dc-serving-cert podName:5d090067-e17c-4708-8d06-23dd0e9da1dc nodeName:}" failed. No retries permitted until 2026-01-28 18:55:46.088871572 +0000 UTC m=+1214.100398347 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5d090067-e17c-4708-8d06-23dd0e9da1dc-serving-cert") pod "observability-ui-dashboards-66cbf594b5-9lj2z" (UID: "5d090067-e17c-4708-8d06-23dd0e9da1dc") : secret "observability-ui-dashboards" not found Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.621010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sf7\" (UniqueName: \"kubernetes.io/projected/5d090067-e17c-4708-8d06-23dd0e9da1dc-kube-api-access-57sf7\") pod \"observability-ui-dashboards-66cbf594b5-9lj2z\" (UID: \"5d090067-e17c-4708-8d06-23dd0e9da1dc\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.657798 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd8f99d54-pvptd"] Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.664346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.679563 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd8f99d54-pvptd"] Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.815815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-config\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.815884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-serving-cert\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.815921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-trusted-ca-bundle\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.816177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-service-ca\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.816217 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-oauth-config\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.816263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7c5\" (UniqueName: 
\"kubernetes.io/projected/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-kube-api-access-4n7c5\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.816361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-oauth-serving-cert\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.894059 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.898092 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.901901 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.902440 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.902616 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.902797 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.902957 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.903375 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.909583 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.917656 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l2j2k" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.917828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-config\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.917885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-serving-cert\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.917930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-trusted-ca-bundle\") pod \"console-7cd8f99d54-pvptd\" (UID: 
\"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.918001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-service-ca\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.918018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-oauth-config\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.918043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7c5\" (UniqueName: \"kubernetes.io/projected/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-kube-api-access-4n7c5\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.918077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-oauth-serving-cert\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.919451 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-oauth-serving-cert\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.920579 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-trusted-ca-bundle\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.921062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-service-ca\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.921514 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.925691 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-config\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.931643 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-oauth-config\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.934988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7c5\" (UniqueName: \"kubernetes.io/projected/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-kube-api-access-4n7c5\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:45 crc kubenswrapper[4749]: I0128 18:55:45.949590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff-console-serving-cert\") pod \"console-7cd8f99d54-pvptd\" (UID: \"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff\") " pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.001802 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.019693 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.019800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.019835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.019867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.020044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.020159 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.020758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.020934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dbx\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-kube-api-access-c4dbx\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.021035 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.021063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123060 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d090067-e17c-4708-8d06-23dd0e9da1dc-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-9lj2z\" (UID: \"5d090067-e17c-4708-8d06-23dd0e9da1dc\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123203 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 
18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123264 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123293 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4dbx\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-kube-api-access-c4dbx\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.123430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.124286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.124433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.125010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.127203 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d090067-e17c-4708-8d06-23dd0e9da1dc-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-9lj2z\" (UID: \"5d090067-e17c-4708-8d06-23dd0e9da1dc\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.128174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.128275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.129012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.129118 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.129599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.134276 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.134319 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/744b74a6be0495f7b599fd99593ce019a865164c2389f972fc00046589d495f0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.145960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dbx\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-kube-api-access-c4dbx\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.180557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"prometheus-metric-storage-0\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.226112 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 18:55:46 crc kubenswrapper[4749]: I0128 18:55:46.250256 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.645306 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.647125 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.655879 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.656230 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-f7rpw" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.656414 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.700448 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.700795 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.709257 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.754838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.755237 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886e17a6-463f-46b1-a745-5420d806f7e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.755362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c9d424e-04e0-49b0-81ff-08627cf2c618\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c9d424e-04e0-49b0-81ff-08627cf2c618\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.755400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/886e17a6-463f-46b1-a745-5420d806f7e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.755428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2zn\" (UniqueName: \"kubernetes.io/projected/886e17a6-463f-46b1-a745-5420d806f7e1-kube-api-access-wk2zn\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.755570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.755724 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.755803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/886e17a6-463f-46b1-a745-5420d806f7e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.857284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886e17a6-463f-46b1-a745-5420d806f7e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.857366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c9d424e-04e0-49b0-81ff-08627cf2c618\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c9d424e-04e0-49b0-81ff-08627cf2c618\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.857387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/886e17a6-463f-46b1-a745-5420d806f7e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.857404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2zn\" (UniqueName: \"kubernetes.io/projected/886e17a6-463f-46b1-a745-5420d806f7e1-kube-api-access-wk2zn\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.857431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.857465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.857493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/886e17a6-463f-46b1-a745-5420d806f7e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.857528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.859875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/886e17a6-463f-46b1-a745-5420d806f7e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.860234 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886e17a6-463f-46b1-a745-5420d806f7e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.860778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/886e17a6-463f-46b1-a745-5420d806f7e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.873542 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.873577 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c9d424e-04e0-49b0-81ff-08627cf2c618\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c9d424e-04e0-49b0-81ff-08627cf2c618\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/432a63986c501a11e58e21a969515db53da1bcdba62a223f2229d8b002387e2e/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.874482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.875957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.876126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886e17a6-463f-46b1-a745-5420d806f7e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.876696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2zn\" (UniqueName: \"kubernetes.io/projected/886e17a6-463f-46b1-a745-5420d806f7e1-kube-api-access-wk2zn\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:47 crc kubenswrapper[4749]: I0128 18:55:47.912129 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-4c9d424e-04e0-49b0-81ff-08627cf2c618\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c9d424e-04e0-49b0-81ff-08627cf2c618\") pod \"ovsdbserver-nb-0\" (UID: \"886e17a6-463f-46b1-a745-5420d806f7e1\") " pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.032178 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.860107 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-g8gw7"] Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.861970 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.865258 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ncr8m" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.865604 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.865740 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.900420 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g8gw7"] Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.948733 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7z7wf"] Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.962708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.968019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7z7wf"] Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.988512 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca89079d-f8be-4e31-a78d-1c4257260a8f-scripts\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.988593 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-log-ovn\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.988645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca89079d-f8be-4e31-a78d-1c4257260a8f-ovn-controller-tls-certs\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.988679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca89079d-f8be-4e31-a78d-1c4257260a8f-combined-ca-bundle\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:48 crc kubenswrapper[4749]: 
I0128 18:55:48.988945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-run-ovn\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.989052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l22j4\" (UniqueName: \"kubernetes.io/projected/ca89079d-f8be-4e31-a78d-1c4257260a8f-kube-api-access-l22j4\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:48 crc kubenswrapper[4749]: I0128 18:55:48.989196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-run\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94c11021-07ce-4db3-ae6d-19ff89737e77-scripts\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-lib\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca89079d-f8be-4e31-a78d-1c4257260a8f-ovn-controller-tls-certs\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-log\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca89079d-f8be-4e31-a78d-1c4257260a8f-combined-ca-bundle\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-run\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091235 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-run-ovn\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22j4\" (UniqueName: \"kubernetes.io/projected/ca89079d-f8be-4e31-a78d-1c4257260a8f-kube-api-access-l22j4\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-run\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091486 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-etc-ovs\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca89079d-f8be-4e31-a78d-1c4257260a8f-scripts\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxr4r\" (UniqueName: \"kubernetes.io/projected/94c11021-07ce-4db3-ae6d-19ff89737e77-kube-api-access-gxr4r\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-log-ovn\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-run-ovn\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-log-ovn\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.091970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca89079d-f8be-4e31-a78d-1c4257260a8f-var-run\") pod \"ovn-controller-g8gw7\" (UID: 
\"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.094641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca89079d-f8be-4e31-a78d-1c4257260a8f-combined-ca-bundle\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.094637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca89079d-f8be-4e31-a78d-1c4257260a8f-ovn-controller-tls-certs\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.094733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca89079d-f8be-4e31-a78d-1c4257260a8f-scripts\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.110667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22j4\" (UniqueName: \"kubernetes.io/projected/ca89079d-f8be-4e31-a78d-1c4257260a8f-kube-api-access-l22j4\") pod \"ovn-controller-g8gw7\" (UID: \"ca89079d-f8be-4e31-a78d-1c4257260a8f\") " pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94c11021-07ce-4db3-ae6d-19ff89737e77-scripts\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-lib\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-log\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-run\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193300 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-etc-ovs\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gxr4r\" (UniqueName: \"kubernetes.io/projected/94c11021-07ce-4db3-ae6d-19ff89737e77-kube-api-access-gxr4r\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-run\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193456 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-log\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-etc-ovs\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.193693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/94c11021-07ce-4db3-ae6d-19ff89737e77-var-lib\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.195621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94c11021-07ce-4db3-ae6d-19ff89737e77-scripts\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.213091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxr4r\" (UniqueName: \"kubernetes.io/projected/94c11021-07ce-4db3-ae6d-19ff89737e77-kube-api-access-gxr4r\") pod \"ovn-controller-ovs-7z7wf\" (UID: \"94c11021-07ce-4db3-ae6d-19ff89737e77\") " pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.220111 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g8gw7" Jan 28 18:55:49 crc kubenswrapper[4749]: I0128 18:55:49.287106 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.348046 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.350619 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.355701 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-pdnhd" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.357985 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.358270 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.358555 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.374495 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.438110 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5279af8d-ea25-418b-870f-71308c403b44-config\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.438224 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.438249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5279af8d-ea25-418b-870f-71308c403b44-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.438268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.438341 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6x7l\" (UniqueName: \"kubernetes.io/projected/5279af8d-ea25-418b-870f-71308c403b44-kube-api-access-f6x7l\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.438381 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83c0be4b-b54c-45ac-b83b-ab56c818f3db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83c0be4b-b54c-45ac-b83b-ab56c818f3db\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.439067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.439319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5279af8d-ea25-418b-870f-71308c403b44-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.543016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.543080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5279af8d-ea25-418b-870f-71308c403b44-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.543106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.543164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6x7l\" (UniqueName: \"kubernetes.io/projected/5279af8d-ea25-418b-870f-71308c403b44-kube-api-access-f6x7l\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.543192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83c0be4b-b54c-45ac-b83b-ab56c818f3db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83c0be4b-b54c-45ac-b83b-ab56c818f3db\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.543240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.543457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5279af8d-ea25-418b-870f-71308c403b44-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.543814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5279af8d-ea25-418b-870f-71308c403b44-config\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 
18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.544546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5279af8d-ea25-418b-870f-71308c403b44-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.544807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5279af8d-ea25-418b-870f-71308c403b44-config\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.545011 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5279af8d-ea25-418b-870f-71308c403b44-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.549220 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.549253 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83c0be4b-b54c-45ac-b83b-ab56c818f3db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83c0be4b-b54c-45ac-b83b-ab56c818f3db\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f4e72eb9b8066da245e8e316d9142ceca971098e8c1341d732e1abf317fb3274/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.549879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.550217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.559221 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5279af8d-ea25-418b-870f-71308c403b44-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.561963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6x7l\" (UniqueName: \"kubernetes.io/projected/5279af8d-ea25-418b-870f-71308c403b44-kube-api-access-f6x7l\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.581181 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83c0be4b-b54c-45ac-b83b-ab56c818f3db\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83c0be4b-b54c-45ac-b83b-ab56c818f3db\") pod \"ovsdbserver-sb-0\" (UID: \"5279af8d-ea25-418b-870f-71308c403b44\") " pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:51 crc kubenswrapper[4749]: I0128 18:55:51.682765 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 28 18:55:56 crc kubenswrapper[4749]: E0128 18:55:56.744171 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 28 18:55:56 crc kubenswrapper[4749]: E0128 18:55:56.744916 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-49kqf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-qzkpr_openstack(699d2959-eac0-4edd-a8c7-cd893f426b40): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:55:56 crc kubenswrapper[4749]: E0128 18:55:56.746174 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" podUID="699d2959-eac0-4edd-a8c7-cd893f426b40" Jan 28 18:55:56 crc kubenswrapper[4749]: E0128 18:55:56.792428 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 28 18:55:56 crc 
kubenswrapper[4749]: E0128 18:55:56.792758 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qjnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-6kmfq_openstack(87ffdafe-209d-4c00-a039-5353a513db9d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:55:56 crc kubenswrapper[4749]: E0128 18:55:56.794842 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" podUID="87ffdafe-209d-4c00-a039-5353a513db9d" Jan 28 18:55:57 crc kubenswrapper[4749]: I0128 18:55:57.781775 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mn7sc"] Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.476625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" event={"ID":"4d1004cb-d7e0-4f3a-9739-33d4afdec629","Type":"ContainerStarted","Data":"c19280f3ccab2f1a9b0026e24438b681f80f5de4542f62604720af4682558b3c"} Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.724653 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.729120 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.799894 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z"] Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.817578 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49kqf\" (UniqueName: \"kubernetes.io/projected/699d2959-eac0-4edd-a8c7-cd893f426b40-kube-api-access-49kqf\") pod \"699d2959-eac0-4edd-a8c7-cd893f426b40\" (UID: \"699d2959-eac0-4edd-a8c7-cd893f426b40\") " Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.817645 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qjnk\" (UniqueName: \"kubernetes.io/projected/87ffdafe-209d-4c00-a039-5353a513db9d-kube-api-access-8qjnk\") pod \"87ffdafe-209d-4c00-a039-5353a513db9d\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.817694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-dns-svc\") pod \"87ffdafe-209d-4c00-a039-5353a513db9d\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.817733 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699d2959-eac0-4edd-a8c7-cd893f426b40-config\") pod \"699d2959-eac0-4edd-a8c7-cd893f426b40\" (UID: \"699d2959-eac0-4edd-a8c7-cd893f426b40\") " Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.817856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-config\") pod \"87ffdafe-209d-4c00-a039-5353a513db9d\" (UID: \"87ffdafe-209d-4c00-a039-5353a513db9d\") " Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.819198 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87ffdafe-209d-4c00-a039-5353a513db9d" (UID: "87ffdafe-209d-4c00-a039-5353a513db9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.819214 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699d2959-eac0-4edd-a8c7-cd893f426b40-config" (OuterVolumeSpecName: "config") pod "699d2959-eac0-4edd-a8c7-cd893f426b40" (UID: "699d2959-eac0-4edd-a8c7-cd893f426b40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.825977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699d2959-eac0-4edd-a8c7-cd893f426b40-kube-api-access-49kqf" (OuterVolumeSpecName: "kube-api-access-49kqf") pod "699d2959-eac0-4edd-a8c7-cd893f426b40" (UID: "699d2959-eac0-4edd-a8c7-cd893f426b40"). InnerVolumeSpecName "kube-api-access-49kqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.828709 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ffdafe-209d-4c00-a039-5353a513db9d-kube-api-access-8qjnk" (OuterVolumeSpecName: "kube-api-access-8qjnk") pod "87ffdafe-209d-4c00-a039-5353a513db9d" (UID: "87ffdafe-209d-4c00-a039-5353a513db9d"). InnerVolumeSpecName "kube-api-access-8qjnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.834429 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd8f99d54-pvptd"] Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.836425 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-config" (OuterVolumeSpecName: "config") pod "87ffdafe-209d-4c00-a039-5353a513db9d" (UID: "87ffdafe-209d-4c00-a039-5353a513db9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.845711 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 28 18:55:58 crc kubenswrapper[4749]: W0128 18:55:58.849360 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94850daa_65af_4e6a_ad29_cfa28c3076e7.slice/crio-00881a7f22c23fe3fe2f5c757805e9d095a8f91e89514ba11239e58f649bd435 WatchSource:0}: Error finding container 00881a7f22c23fe3fe2f5c757805e9d095a8f91e89514ba11239e58f649bd435: Status 404 returned error can't find the container with id 00881a7f22c23fe3fe2f5c757805e9d095a8f91e89514ba11239e58f649bd435 Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.861425 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 28 18:55:58 crc kubenswrapper[4749]: W0128 18:55:58.879140 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5954ab85_e42a_498a_ae91_fd46445c0860.slice/crio-73b21d5acc0f9e022ffda7eeb75ac36e14a639cb67baa4d42a6d7f15e5ea7470 WatchSource:0}: Error finding container 73b21d5acc0f9e022ffda7eeb75ac36e14a639cb67baa4d42a6d7f15e5ea7470: Status 404 returned error can't find the container with id 73b21d5acc0f9e022ffda7eeb75ac36e14a639cb67baa4d42a6d7f15e5ea7470 Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.884593 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.884627 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 18:55:58 crc kubenswrapper[4749]: W0128 18:55:58.888220 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c85419b_39a8_4c29_bff0_475dfc988a32.slice/crio-3d1b0c2d1ce00dd372871bcad75dd7f5f665af2d523a33a37ecbd55547c32e6c WatchSource:0}: Error finding container 3d1b0c2d1ce00dd372871bcad75dd7f5f665af2d523a33a37ecbd55547c32e6c: Status 404 returned error can't find the container with id 3d1b0c2d1ce00dd372871bcad75dd7f5f665af2d523a33a37ecbd55547c32e6c Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.892147 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.900917 
4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.920853 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.920883 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49kqf\" (UniqueName: \"kubernetes.io/projected/699d2959-eac0-4edd-a8c7-cd893f426b40-kube-api-access-49kqf\") on node \"crc\" DevicePath \"\"" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.920893 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qjnk\" (UniqueName: \"kubernetes.io/projected/87ffdafe-209d-4c00-a039-5353a513db9d-kube-api-access-8qjnk\") on node \"crc\" DevicePath \"\"" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.920902 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87ffdafe-209d-4c00-a039-5353a513db9d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:55:58 crc kubenswrapper[4749]: I0128 18:55:58.920911 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699d2959-eac0-4edd-a8c7-cd893f426b40-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.391290 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wgmbm"] Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.403679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.419494 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.432919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 18:55:59 crc kubenswrapper[4749]: W0128 18:55:59.433451 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b3ee51e_214c_4efe_875b_51db3beea94c.slice/crio-4b6932209776ad1d9fcd82fa7042bc7335e2b6471ad99dbaf202bcbe57459a9c WatchSource:0}: Error finding container 4b6932209776ad1d9fcd82fa7042bc7335e2b6471ad99dbaf202bcbe57459a9c: Status 404 returned error can't find the container with id 4b6932209776ad1d9fcd82fa7042bc7335e2b6471ad99dbaf202bcbe57459a9c Jan 28 18:55:59 crc kubenswrapper[4749]: W0128 18:55:59.443189 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce4d6a9_0e4d_4da3_903d_6d29e70d6b0b.slice/crio-23f715f017cd27f19d95dadab0bcb6898bf9d9776105967a2cbce19d59e09467 WatchSource:0}: Error finding container 23f715f017cd27f19d95dadab0bcb6898bf9d9776105967a2cbce19d59e09467: Status 404 returned error can't find the container with id 23f715f017cd27f19d95dadab0bcb6898bf9d9776105967a2cbce19d59e09467 Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.447095 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g8gw7"] Jan 28 18:55:59 crc kubenswrapper[4749]: W0128 18:55:59.452507 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fa32822_e9c9_4eb5_8a9b_d8f0ef76177e.slice/crio-239107c674991aad52aefd406f64002e65623337fa2555ee930421c4bcd060c8 WatchSource:0}: Error finding container 239107c674991aad52aefd406f64002e65623337fa2555ee930421c4bcd060c8: Status 404 returned error can't find the container with id 239107c674991aad52aefd406f64002e65623337fa2555ee930421c4bcd060c8 Jan 28 18:55:59 crc kubenswrapper[4749]: W0128 18:55:59.466080 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca89079d_f8be_4e31_a78d_1c4257260a8f.slice/crio-474475f76309b8faa11f00b28b8ab6375ebf6324496729a655d900554bf6876e WatchSource:0}: Error finding container 474475f76309b8faa11f00b28b8ab6375ebf6324496729a655d900554bf6876e: Status 404 returned error can't find the container with id 474475f76309b8faa11f00b28b8ab6375ebf6324496729a655d900554bf6876e Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.515199 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" event={"ID":"8b3ee51e-214c-4efe-875b-51db3beea94c","Type":"ContainerStarted","Data":"4b6932209776ad1d9fcd82fa7042bc7335e2b6471ad99dbaf202bcbe57459a9c"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.519607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerStarted","Data":"23f715f017cd27f19d95dadab0bcb6898bf9d9776105967a2cbce19d59e09467"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.521186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8gw7" event={"ID":"ca89079d-f8be-4e31-a78d-1c4257260a8f","Type":"ContainerStarted","Data":"474475f76309b8faa11f00b28b8ab6375ebf6324496729a655d900554bf6876e"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.526396 4749 generic.go:334] "Generic (PLEG): container finished" podID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" containerID="209f03b475999b55ac44f0d3931a0be27d7b14ca7b102884695a6728ff43801e" exitCode=0 Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.526513 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" event={"ID":"4d1004cb-d7e0-4f3a-9739-33d4afdec629","Type":"ContainerDied","Data":"209f03b475999b55ac44f0d3931a0be27d7b14ca7b102884695a6728ff43801e"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.533697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2a507d2-d263-4695-96e3-4f7af4761450","Type":"ContainerStarted","Data":"e72752871dc7960ae74be230ed3f750b552f6d40a39e3e888dac6dbc6c346d22"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.538478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd8f99d54-pvptd" event={"ID":"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff","Type":"ContainerStarted","Data":"29621197185bf82449cb7cbde5d7486e61671dc35d19a18271cb6d43a7e4badc"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.538531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd8f99d54-pvptd" event={"ID":"4ceb5aa9-dc58-43f3-b807-48cf6c4d6bff","Type":"ContainerStarted","Data":"cedb5ee2c29a1674441f0fe456e03bdafad04f7348cfbe3484868e030509ea2d"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.557937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" event={"ID":"5d090067-e17c-4708-8d06-23dd0e9da1dc","Type":"ContainerStarted","Data":"d27763aa39a6f07ccb1419bb261bdaf0ddadf2d353a3c277fe820df89cc4f30e"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.565964 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a12543-c0a3-486f-b5bd-4f2862c15a37","Type":"ContainerStarted","Data":"2a5a0cf864b08bca525c9f75e85931ca080d1a53df44eaeb64b541c32472093d"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.570490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5954ab85-e42a-498a-ae91-fd46445c0860","Type":"ContainerStarted","Data":"73b21d5acc0f9e022ffda7eeb75ac36e14a639cb67baa4d42a6d7f15e5ea7470"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.576320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"94850daa-65af-4e6a-ad29-cfa28c3076e7","Type":"ContainerStarted","Data":"00881a7f22c23fe3fe2f5c757805e9d095a8f91e89514ba11239e58f649bd435"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.580290 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cd8f99d54-pvptd" podStartSLOduration=14.5802707 podStartE2EDuration="14.5802707s" podCreationTimestamp="2026-01-28 18:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:55:59.568636412 +0000 UTC m=+1227.580163207" watchObservedRunningTime="2026-01-28 18:55:59.5802707 +0000 UTC m=+1227.591797475" Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.585811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1c85419b-39a8-4c29-bff0-475dfc988a32","Type":"ContainerStarted","Data":"3d1b0c2d1ce00dd372871bcad75dd7f5f665af2d523a33a37ecbd55547c32e6c"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.590297 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e","Type":"ContainerStarted","Data":"239107c674991aad52aefd406f64002e65623337fa2555ee930421c4bcd060c8"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.592631 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" event={"ID":"699d2959-eac0-4edd-a8c7-cd893f426b40","Type":"ContainerDied","Data":"1f794555b62c19e4cbecf8505903cc535a968e66dbab2a9fddad461c03f6c96a"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.592729 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qzkpr" Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.596584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"772600b2-9086-4d72-bb86-6edfb0a21b35","Type":"ContainerStarted","Data":"6bff4d403112ae221a103d4dcd8c54463bc92541cad993de2f996cfd727b0ffc"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.601696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6c01d945-dccd-468e-b6ce-d269f1715462","Type":"ContainerStarted","Data":"ec9a0ee87e6bb2a65865651ad0c9fcd453ac5e4a7bb8f5cb8195a9b0c8281586"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.604130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" event={"ID":"87ffdafe-209d-4c00-a039-5353a513db9d","Type":"ContainerDied","Data":"31b3abd2151d00e4aaa12223afddbc6b23b31b76a8508d98eef71eab32b5e13d"} Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.604202 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6kmfq" Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.711248 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qzkpr"] Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.756755 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qzkpr"] Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.800905 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6kmfq"] Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.814976 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6kmfq"] Jan 28 18:55:59 crc kubenswrapper[4749]: W0128 18:55:59.822833 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c11021_07ce_4db3_ae6d_19ff89737e77.slice/crio-301b829a5218c81ec0aad33d4f2ef2a7b3f096579930ed757fc4d5918bfb2086 WatchSource:0}: Error finding container 301b829a5218c81ec0aad33d4f2ef2a7b3f096579930ed757fc4d5918bfb2086: Status 404 returned error can't find the container with id 301b829a5218c81ec0aad33d4f2ef2a7b3f096579930ed757fc4d5918bfb2086 Jan 28 18:55:59 crc kubenswrapper[4749]: I0128 18:55:59.828083 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7z7wf"] Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.463743 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.566542 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 18:56:00 crc kubenswrapper[4749]: W0128 18:56:00.608562 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod886e17a6_463f_46b1_a745_5420d806f7e1.slice/crio-928c4af04983724aca0ac2f96dfc8861cc917eb4dd5f5470ee20fb824510b2c0 WatchSource:0}: Error finding container 928c4af04983724aca0ac2f96dfc8861cc917eb4dd5f5470ee20fb824510b2c0: Status 404 returned error can't find the container with id 928c4af04983724aca0ac2f96dfc8861cc917eb4dd5f5470ee20fb824510b2c0 Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.622527 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" 
event={"ID":"4d1004cb-d7e0-4f3a-9739-33d4afdec629","Type":"ContainerStarted","Data":"2258e84cbaa77357328448c6481da45de99e4a0c7644c4705401df368db01a77"} Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.622608 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.625512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7z7wf" event={"ID":"94c11021-07ce-4db3-ae6d-19ff89737e77","Type":"ContainerStarted","Data":"301b829a5218c81ec0aad33d4f2ef2a7b3f096579930ed757fc4d5918bfb2086"} Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.629813 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b3ee51e-214c-4efe-875b-51db3beea94c" containerID="a88a4b6f1c8b54108130338b0cfa2f5271487458aceec5c60d2ac14ff2e4747d" exitCode=0 Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.629847 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" event={"ID":"8b3ee51e-214c-4efe-875b-51db3beea94c","Type":"ContainerDied","Data":"a88a4b6f1c8b54108130338b0cfa2f5271487458aceec5c60d2ac14ff2e4747d"} Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.649433 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" podStartSLOduration=22.11639876 podStartE2EDuration="22.649410923s" podCreationTimestamp="2026-01-28 18:55:38 +0000 UTC" firstStartedPulling="2026-01-28 18:55:57.798426839 +0000 UTC m=+1225.809953614" lastFinishedPulling="2026-01-28 18:55:58.331439002 +0000 UTC m=+1226.342965777" observedRunningTime="2026-01-28 18:56:00.641307322 +0000 UTC m=+1228.652834097" watchObservedRunningTime="2026-01-28 18:56:00.649410923 +0000 UTC m=+1228.660937698" Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.892460 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699d2959-eac0-4edd-a8c7-cd893f426b40" path="/var/lib/kubelet/pods/699d2959-eac0-4edd-a8c7-cd893f426b40/volumes" Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.895105 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ffdafe-209d-4c00-a039-5353a513db9d" path="/var/lib/kubelet/pods/87ffdafe-209d-4c00-a039-5353a513db9d/volumes" Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.930484 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nlxrf"] Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.932015 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.933944 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 28 18:56:00 crc kubenswrapper[4749]: I0128 18:56:00.938808 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nlxrf"] Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.070591 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mn7sc"] Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.090058 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-fxd7v"] Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.091661 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.094580 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.105510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43a4085d-f42a-4a25-8a00-8cfc6c771821-ovn-rundir\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.105585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a4085d-f42a-4a25-8a00-8cfc6c771821-combined-ca-bundle\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.105617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43a4085d-f42a-4a25-8a00-8cfc6c771821-ovs-rundir\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.105712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h526m\" (UniqueName: \"kubernetes.io/projected/43a4085d-f42a-4a25-8a00-8cfc6c771821-kube-api-access-h526m\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.105739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a4085d-f42a-4a25-8a00-8cfc6c771821-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.105760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a4085d-f42a-4a25-8a00-8cfc6c771821-config\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.112805 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-fxd7v"] Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.215532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-config\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.215759 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h526m\" (UniqueName: \"kubernetes.io/projected/43a4085d-f42a-4a25-8a00-8cfc6c771821-kube-api-access-h526m\") pod \"ovn-controller-metrics-nlxrf\" (UID: 
\"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.215813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a4085d-f42a-4a25-8a00-8cfc6c771821-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.216036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a4085d-f42a-4a25-8a00-8cfc6c771821-config\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.216091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.216170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2tdl\" (UniqueName: \"kubernetes.io/projected/df19a7b0-50b1-418f-9822-48f47b2be635-kube-api-access-p2tdl\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.216191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43a4085d-f42a-4a25-8a00-8cfc6c771821-ovn-rundir\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.217541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a4085d-f42a-4a25-8a00-8cfc6c771821-combined-ca-bundle\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.217589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.217633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43a4085d-f42a-4a25-8a00-8cfc6c771821-ovs-rundir\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.217769 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43a4085d-f42a-4a25-8a00-8cfc6c771821-ovn-rundir\") pod \"ovn-controller-metrics-nlxrf\" (UID: 
\"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.217809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43a4085d-f42a-4a25-8a00-8cfc6c771821-ovs-rundir\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.219434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a4085d-f42a-4a25-8a00-8cfc6c771821-config\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.229976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a4085d-f42a-4a25-8a00-8cfc6c771821-combined-ca-bundle\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.230641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a4085d-f42a-4a25-8a00-8cfc6c771821-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.246116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h526m\" (UniqueName: \"kubernetes.io/projected/43a4085d-f42a-4a25-8a00-8cfc6c771821-kube-api-access-h526m\") pod \"ovn-controller-metrics-nlxrf\" (UID: \"43a4085d-f42a-4a25-8a00-8cfc6c771821\") " pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.277674 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wgmbm"] Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.278675 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nlxrf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.303393 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s9fdf"] Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.305098 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.310637 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.314005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s9fdf"] Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.319399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-config\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.319567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.319611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2tdl\" (UniqueName: \"kubernetes.io/projected/df19a7b0-50b1-418f-9822-48f47b2be635-kube-api-access-p2tdl\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.319661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.320742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.321440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-config\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.322071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.360623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2tdl\" (UniqueName: \"kubernetes.io/projected/df19a7b0-50b1-418f-9822-48f47b2be635-kube-api-access-p2tdl\") pod \"dnsmasq-dns-7fd796d7df-fxd7v\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.421640 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-config\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.421697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbj2b\" (UniqueName: \"kubernetes.io/projected/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-kube-api-access-nbj2b\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.421731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.422045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.422285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.425495 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.525055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.525174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-config\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.525202 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbj2b\" (UniqueName: \"kubernetes.io/projected/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-kube-api-access-nbj2b\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.525230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.525318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.526068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.526410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-config\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.526621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.526725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 
18:56:01.544225 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbj2b\" (UniqueName: \"kubernetes.io/projected/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-kube-api-access-nbj2b\") pod \"dnsmasq-dns-86db49b7ff-s9fdf\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.645573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5279af8d-ea25-418b-870f-71308c403b44","Type":"ContainerStarted","Data":"236dc8e45b8026a0a3f3676f42c3670a300640852a643735e483fe1ad896bcc8"} Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.647345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"886e17a6-463f-46b1-a745-5420d806f7e1","Type":"ContainerStarted","Data":"928c4af04983724aca0ac2f96dfc8861cc917eb4dd5f5470ee20fb824510b2c0"} Jan 28 18:56:01 crc kubenswrapper[4749]: I0128 18:56:01.709351 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:02 crc kubenswrapper[4749]: I0128 18:56:02.656887 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" podUID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" containerName="dnsmasq-dns" containerID="cri-o://2258e84cbaa77357328448c6481da45de99e4a0c7644c4705401df368db01a77" gracePeriod=10 Jan 28 18:56:03 crc kubenswrapper[4749]: I0128 18:56:03.678386 4749 generic.go:334] "Generic (PLEG): container finished" podID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" containerID="2258e84cbaa77357328448c6481da45de99e4a0c7644c4705401df368db01a77" exitCode=0 Jan 28 18:56:03 crc kubenswrapper[4749]: I0128 18:56:03.679029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" event={"ID":"4d1004cb-d7e0-4f3a-9739-33d4afdec629","Type":"ContainerDied","Data":"2258e84cbaa77357328448c6481da45de99e4a0c7644c4705401df368db01a77"} Jan 28 18:56:06 crc kubenswrapper[4749]: I0128 18:56:06.002598 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:56:06 crc kubenswrapper[4749]: I0128 18:56:06.002922 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:56:06 crc kubenswrapper[4749]: I0128 18:56:06.008365 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:56:06 crc kubenswrapper[4749]: I0128 18:56:06.707126 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd8f99d54-pvptd" Jan 28 18:56:06 crc kubenswrapper[4749]: I0128 18:56:06.764195 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ff46b7d8d-w6bmg"] Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.826700 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.895432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-config\") pod \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.895587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-dns-svc\") pod \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.895692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7npc7\" (UniqueName: \"kubernetes.io/projected/4d1004cb-d7e0-4f3a-9739-33d4afdec629-kube-api-access-7npc7\") pod \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\" (UID: \"4d1004cb-d7e0-4f3a-9739-33d4afdec629\") " Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.903250 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1004cb-d7e0-4f3a-9739-33d4afdec629-kube-api-access-7npc7" (OuterVolumeSpecName: "kube-api-access-7npc7") pod "4d1004cb-d7e0-4f3a-9739-33d4afdec629" (UID: "4d1004cb-d7e0-4f3a-9739-33d4afdec629"). InnerVolumeSpecName "kube-api-access-7npc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.977357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d1004cb-d7e0-4f3a-9739-33d4afdec629" (UID: "4d1004cb-d7e0-4f3a-9739-33d4afdec629"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.994130 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-config" (OuterVolumeSpecName: "config") pod "4d1004cb-d7e0-4f3a-9739-33d4afdec629" (UID: "4d1004cb-d7e0-4f3a-9739-33d4afdec629"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.998609 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.998636 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d1004cb-d7e0-4f3a-9739-33d4afdec629-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:07 crc kubenswrapper[4749]: I0128 18:56:07.998648 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7npc7\" (UniqueName: \"kubernetes.io/projected/4d1004cb-d7e0-4f3a-9739-33d4afdec629-kube-api-access-7npc7\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:08 crc kubenswrapper[4749]: I0128 18:56:08.725590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" event={"ID":"4d1004cb-d7e0-4f3a-9739-33d4afdec629","Type":"ContainerDied","Data":"c19280f3ccab2f1a9b0026e24438b681f80f5de4542f62604720af4682558b3c"} Jan 28 18:56:08 crc kubenswrapper[4749]: I0128 18:56:08.725904 4749 scope.go:117] "RemoveContainer" containerID="2258e84cbaa77357328448c6481da45de99e4a0c7644c4705401df368db01a77" Jan 28 18:56:08 crc kubenswrapper[4749]: I0128 18:56:08.726022 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mn7sc" Jan 28 18:56:08 crc kubenswrapper[4749]: I0128 18:56:08.774239 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mn7sc"] Jan 28 18:56:08 crc kubenswrapper[4749]: I0128 18:56:08.813302 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mn7sc"] Jan 28 18:56:08 crc kubenswrapper[4749]: I0128 18:56:08.885010 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" path="/var/lib/kubelet/pods/4d1004cb-d7e0-4f3a-9739-33d4afdec629/volumes" Jan 28 18:56:09 crc kubenswrapper[4749]: I0128 18:56:09.066410 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-fxd7v"] Jan 28 18:56:09 crc kubenswrapper[4749]: I0128 18:56:09.518735 4749 scope.go:117] "RemoveContainer" containerID="209f03b475999b55ac44f0d3931a0be27d7b14ca7b102884695a6728ff43801e" Jan 28 18:56:09 crc kubenswrapper[4749]: I0128 18:56:09.586992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nlxrf"] Jan 28 18:56:09 crc kubenswrapper[4749]: W0128 18:56:09.662457 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43a4085d_f42a_4a25_8a00_8cfc6c771821.slice/crio-70da940c7ab07eadc6a9b2b45b5b0402ff25ef6a09707637a1900ac5e67af852 WatchSource:0}: Error finding container 70da940c7ab07eadc6a9b2b45b5b0402ff25ef6a09707637a1900ac5e67af852: Status 404 returned error can't find the container with id 70da940c7ab07eadc6a9b2b45b5b0402ff25ef6a09707637a1900ac5e67af852 Jan 28 18:56:09 crc kubenswrapper[4749]: I0128 18:56:09.690868 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s9fdf"] Jan 28 18:56:09 crc kubenswrapper[4749]: I0128 18:56:09.733838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nlxrf" 
event={"ID":"43a4085d-f42a-4a25-8a00-8cfc6c771821","Type":"ContainerStarted","Data":"70da940c7ab07eadc6a9b2b45b5b0402ff25ef6a09707637a1900ac5e67af852"} Jan 28 18:56:09 crc kubenswrapper[4749]: I0128 18:56:09.736064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" event={"ID":"df19a7b0-50b1-418f-9822-48f47b2be635","Type":"ContainerStarted","Data":"a19450ec92d3794afc231dd341e69d248658b200ec53990d3af7f33003151dc5"} Jan 28 18:56:10 crc kubenswrapper[4749]: I0128 18:56:10.747337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" event={"ID":"68126ff8-8a74-4130-ba3f-a806dc4a8b2a","Type":"ContainerStarted","Data":"fb152edf0a955dac2809add73f6b8c21c68d0839db10eaaf17e12eeaf0aa38e4"} Jan 28 18:56:10 crc kubenswrapper[4749]: I0128 18:56:10.750002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" event={"ID":"5d090067-e17c-4708-8d06-23dd0e9da1dc","Type":"ContainerStarted","Data":"b28ce429c74da915139344608aacf4f9d5ea58b51e0ce7198c9f548beb62745d"} Jan 28 18:56:10 crc kubenswrapper[4749]: I0128 18:56:10.767578 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-9lj2z" podStartSLOduration=16.605070075 podStartE2EDuration="25.767562905s" podCreationTimestamp="2026-01-28 18:55:45 +0000 UTC" firstStartedPulling="2026-01-28 18:55:58.817964713 +0000 UTC m=+1226.829491488" lastFinishedPulling="2026-01-28 18:56:07.980457553 +0000 UTC m=+1235.991984318" observedRunningTime="2026-01-28 18:56:10.765895164 +0000 UTC m=+1238.777421939" watchObservedRunningTime="2026-01-28 18:56:10.767562905 +0000 UTC m=+1238.779089680" Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.761159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1c85419b-39a8-4c29-bff0-475dfc988a32","Type":"ContainerStarted","Data":"01b977bd25255a679238d1fc091eb74e8576271217641c47d4bd4c1f7a1fb129"} Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.761776 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.763034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" event={"ID":"8b3ee51e-214c-4efe-875b-51db3beea94c","Type":"ContainerStarted","Data":"aa9cf9ee1765d865ef0f1f0800263bea431fd671a88a13f2c7fd58e4d3538591"} Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.763279 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.763160 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" podUID="8b3ee51e-214c-4efe-875b-51db3beea94c" containerName="dnsmasq-dns" containerID="cri-o://aa9cf9ee1765d865ef0f1f0800263bea431fd671a88a13f2c7fd58e4d3538591" gracePeriod=10 Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.768267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6c01d945-dccd-468e-b6ce-d269f1715462","Type":"ContainerStarted","Data":"ba69ea8f43e82495d3d0e7e058d027b50b859ad9b2fa9814bdc66b981b428667"} Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.770763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e","Type":"ContainerStarted","Data":"3df2fe5bca8013ddf9c4561ea063551f8c6439e126a8975585ddbc3344eac82e"} Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.778058 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5279af8d-ea25-418b-870f-71308c403b44","Type":"ContainerStarted","Data":"22cb659047577d20bc91d3cbdfbbea803f20247aa983ddd08697949d1ac90ea2"} Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.779872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7z7wf" event={"ID":"94c11021-07ce-4db3-ae6d-19ff89737e77","Type":"ContainerStarted","Data":"071d19124fefdefd933430ae19126d8408196a11f8d22b959f9319cc1965537c"} Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.802639 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.720927556 podStartE2EDuration="29.802616503s" podCreationTimestamp="2026-01-28 18:55:42 +0000 UTC" firstStartedPulling="2026-01-28 18:55:58.891242869 +0000 UTC m=+1226.902769654" lastFinishedPulling="2026-01-28 18:56:07.972931826 +0000 UTC m=+1235.984458601" observedRunningTime="2026-01-28 18:56:11.781007248 +0000 UTC m=+1239.792534043" watchObservedRunningTime="2026-01-28 18:56:11.802616503 +0000 UTC m=+1239.814143278" Jan 28 18:56:11 crc kubenswrapper[4749]: I0128 18:56:11.828727 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" podStartSLOduration=33.82870584 podStartE2EDuration="33.82870584s" podCreationTimestamp="2026-01-28 18:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:56:11.810489919 +0000 UTC m=+1239.822016714" watchObservedRunningTime="2026-01-28 18:56:11.82870584 +0000 UTC m=+1239.840232615" Jan 28 18:56:12 crc kubenswrapper[4749]: I0128 18:56:12.792227 4749 generic.go:334] "Generic (PLEG): container finished" podID="94c11021-07ce-4db3-ae6d-19ff89737e77" containerID="071d19124fefdefd933430ae19126d8408196a11f8d22b959f9319cc1965537c" exitCode=0 Jan 28 18:56:12 crc kubenswrapper[4749]: I0128 18:56:12.792342 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7z7wf" event={"ID":"94c11021-07ce-4db3-ae6d-19ff89737e77","Type":"ContainerDied","Data":"071d19124fefdefd933430ae19126d8408196a11f8d22b959f9319cc1965537c"} Jan 28 18:56:12 crc kubenswrapper[4749]: I0128 18:56:12.797826 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b3ee51e-214c-4efe-875b-51db3beea94c" containerID="aa9cf9ee1765d865ef0f1f0800263bea431fd671a88a13f2c7fd58e4d3538591" exitCode=0 Jan 28 18:56:12 crc kubenswrapper[4749]: I0128 18:56:12.797904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" event={"ID":"8b3ee51e-214c-4efe-875b-51db3beea94c","Type":"ContainerDied","Data":"aa9cf9ee1765d865ef0f1f0800263bea431fd671a88a13f2c7fd58e4d3538591"} Jan 28 18:56:12 crc kubenswrapper[4749]: I0128 18:56:12.799438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a12543-c0a3-486f-b5bd-4f2862c15a37","Type":"ContainerStarted","Data":"e1aa21e768bf671ed2327d9375f72fe33894fdcd7c739d2a4ad6e80fa40db665"} Jan 28 18:56:12 crc kubenswrapper[4749]: I0128 18:56:12.802253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"5954ab85-e42a-498a-ae91-fd46445c0860","Type":"ContainerStarted","Data":"141990bcc151dcca3b6ff439514702fe0b63c85be8fb7fd086262326aeaa5229"} Jan 28 18:56:13 crc kubenswrapper[4749]: I0128 18:56:13.811991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"94850daa-65af-4e6a-ad29-cfa28c3076e7","Type":"ContainerStarted","Data":"8ade27226a2163e9471db289c245657a1eeccd0ff5642100b144b2252989fa2d"} Jan 28 18:56:13 crc kubenswrapper[4749]: I0128 18:56:13.813579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerStarted","Data":"a78f9832d685bb0fcd3bdf9173db5cb564cf3359b95c2bea5f2b936a26aee910"} Jan 28 18:56:14 crc kubenswrapper[4749]: I0128 18:56:14.822955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" event={"ID":"68126ff8-8a74-4130-ba3f-a806dc4a8b2a","Type":"ContainerStarted","Data":"6bb457f7c110f4dff20e8b76a24c3c9c1682944c7d1e1c2e6539e761b121cde2"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.803519 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.842047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2a507d2-d263-4695-96e3-4f7af4761450","Type":"ContainerStarted","Data":"51ae316fbc39ed75ce1554527e47df195568523c65a0f423144dd87f51d0c08f"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.842144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.844166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8gw7" event={"ID":"ca89079d-f8be-4e31-a78d-1c4257260a8f","Type":"ContainerStarted","Data":"ea67b87d3c58e04a2e555f0eaf38a14a5f70a75ea00d6785335472f2e268c252"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.845145 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-g8gw7" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.848461 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7z7wf" event={"ID":"94c11021-07ce-4db3-ae6d-19ff89737e77","Type":"ContainerStarted","Data":"d92a6e0d89d3e3011db9109f8fbba6ff7f4d55aa4946cdc9de3b10a109b4f06b"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.849878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" event={"ID":"8b3ee51e-214c-4efe-875b-51db3beea94c","Type":"ContainerDied","Data":"4b6932209776ad1d9fcd82fa7042bc7335e2b6471ad99dbaf202bcbe57459a9c"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.849910 4749 scope.go:117] "RemoveContainer" containerID="aa9cf9ee1765d865ef0f1f0800263bea431fd671a88a13f2c7fd58e4d3538591" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.850011 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wgmbm" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.856952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nlxrf" event={"ID":"43a4085d-f42a-4a25-8a00-8cfc6c771821","Type":"ContainerStarted","Data":"ffbc2da1a40d6619ec03e1d3dc23443acd68ba8b1744f04ae9402d5c94a82797"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.864671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"886e17a6-463f-46b1-a745-5420d806f7e1","Type":"ContainerStarted","Data":"7b2fbaa2a99286a673405359b4dbe05b783e9f8697995ec5145ec8f4ba2c6ac6"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.864723 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"886e17a6-463f-46b1-a745-5420d806f7e1","Type":"ContainerStarted","Data":"42b88ff9d4e013cd7afcdcf506bf0c35a4d08411801e9553ad77ff55178ecbb4"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.867040 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=21.888977778 podStartE2EDuration="32.867020816s" podCreationTimestamp="2026-01-28 18:55:44 +0000 UTC" firstStartedPulling="2026-01-28 18:55:59.451107238 +0000 UTC m=+1227.462634013" lastFinishedPulling="2026-01-28 18:56:10.429150276 +0000 UTC m=+1238.440677051" observedRunningTime="2026-01-28 18:56:16.855675055 +0000 UTC m=+1244.867201870" watchObservedRunningTime="2026-01-28 18:56:16.867020816 +0000 UTC m=+1244.878547581" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.877945 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-g8gw7" podStartSLOduration=18.755884079 podStartE2EDuration="28.877928197s" podCreationTimestamp="2026-01-28 18:55:48 +0000 UTC" firstStartedPulling="2026-01-28 18:55:59.4790006 +0000 UTC m=+1227.490527375" lastFinishedPulling="2026-01-28 18:56:09.601044728 +0000 UTC m=+1237.612571493" observedRunningTime="2026-01-28 18:56:16.876702976 +0000 UTC m=+1244.888229751" watchObservedRunningTime="2026-01-28 18:56:16.877928197 +0000 UTC m=+1244.889454972" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.883531 4749 generic.go:334] "Generic (PLEG): container finished" podID="df19a7b0-50b1-418f-9822-48f47b2be635" containerID="3413f4d96df0277fa280b9dd23fe07f63586e001d981370c23411031e2d02b46" exitCode=0 Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.900837 4749 generic.go:334] "Generic (PLEG): container finished" podID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerID="6bb457f7c110f4dff20e8b76a24c3c9c1682944c7d1e1c2e6539e761b121cde2" exitCode=0 Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.916388 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.000277897 podStartE2EDuration="30.91636434s" podCreationTimestamp="2026-01-28 18:55:46 +0000 UTC" firstStartedPulling="2026-01-28 18:56:00.614913338 +0000 UTC m=+1228.626440113" lastFinishedPulling="2026-01-28 18:56:09.530999781 +0000 UTC m=+1237.542526556" observedRunningTime="2026-01-28 18:56:16.900291051 +0000 UTC m=+1244.911817836" watchObservedRunningTime="2026-01-28 18:56:16.91636434 +0000 UTC m=+1244.927891105" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.921498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" 
event={"ID":"df19a7b0-50b1-418f-9822-48f47b2be635","Type":"ContainerDied","Data":"3413f4d96df0277fa280b9dd23fe07f63586e001d981370c23411031e2d02b46"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.921545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" event={"ID":"68126ff8-8a74-4130-ba3f-a806dc4a8b2a","Type":"ContainerDied","Data":"6bb457f7c110f4dff20e8b76a24c3c9c1682944c7d1e1c2e6539e761b121cde2"} Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.928530 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nlxrf" podStartSLOduration=10.087396204 podStartE2EDuration="16.928508711s" podCreationTimestamp="2026-01-28 18:56:00 +0000 UTC" firstStartedPulling="2026-01-28 18:56:09.670880169 +0000 UTC m=+1237.682406944" lastFinishedPulling="2026-01-28 18:56:16.511992666 +0000 UTC m=+1244.523519451" observedRunningTime="2026-01-28 18:56:16.914087293 +0000 UTC m=+1244.925614078" watchObservedRunningTime="2026-01-28 18:56:16.928508711 +0000 UTC m=+1244.940035506" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.930128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-config\") pod \"8b3ee51e-214c-4efe-875b-51db3beea94c\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.930211 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-dns-svc\") pod \"8b3ee51e-214c-4efe-875b-51db3beea94c\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.930304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld5zh\" (UniqueName: \"kubernetes.io/projected/8b3ee51e-214c-4efe-875b-51db3beea94c-kube-api-access-ld5zh\") pod \"8b3ee51e-214c-4efe-875b-51db3beea94c\" (UID: \"8b3ee51e-214c-4efe-875b-51db3beea94c\") " Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.932598 4749 scope.go:117] "RemoveContainer" containerID="a88a4b6f1c8b54108130338b0cfa2f5271487458aceec5c60d2ac14ff2e4747d" Jan 28 18:56:16 crc kubenswrapper[4749]: I0128 18:56:16.950363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3ee51e-214c-4efe-875b-51db3beea94c-kube-api-access-ld5zh" (OuterVolumeSpecName: "kube-api-access-ld5zh") pod "8b3ee51e-214c-4efe-875b-51db3beea94c" (UID: "8b3ee51e-214c-4efe-875b-51db3beea94c"). InnerVolumeSpecName "kube-api-access-ld5zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.032613 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld5zh\" (UniqueName: \"kubernetes.io/projected/8b3ee51e-214c-4efe-875b-51db3beea94c-kube-api-access-ld5zh\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.074855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b3ee51e-214c-4efe-875b-51db3beea94c" (UID: "8b3ee51e-214c-4efe-875b-51db3beea94c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.134523 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.176104 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-config" (OuterVolumeSpecName: "config") pod "8b3ee51e-214c-4efe-875b-51db3beea94c" (UID: "8b3ee51e-214c-4efe-875b-51db3beea94c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.237082 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3ee51e-214c-4efe-875b-51db3beea94c-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.479346 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wgmbm"] Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.486929 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wgmbm"] Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.504499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.911929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5279af8d-ea25-418b-870f-71308c403b44","Type":"ContainerStarted","Data":"76b4d3020c68e4c50b25546922093acd745a801e34873f4af5ab948dc8dcb4fc"} Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.914618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" event={"ID":"df19a7b0-50b1-418f-9822-48f47b2be635","Type":"ContainerStarted","Data":"7e8ff099c9d57108014c9b2becae38bc2bcf503b098dec5d4678d37e7d16956f"} Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.914741 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.916912 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" event={"ID":"68126ff8-8a74-4130-ba3f-a806dc4a8b2a","Type":"ContainerStarted","Data":"78648d519e6211ce1a563e570369fdd79e89a5cffbb3dd087d5a84fc0c978a12"} Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.917025 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.918851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"772600b2-9086-4d72-bb86-6edfb0a21b35","Type":"ContainerStarted","Data":"0bf6c4cd26045235299e00809673399a4195061668568b9db557cc64814ea108"} Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.921405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7z7wf" event={"ID":"94c11021-07ce-4db3-ae6d-19ff89737e77","Type":"ContainerStarted","Data":"b675b43c3d067092567b59e1aac678a97d7c7afe3a9cd13fa7389fb6e1c3e60d"} Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.921550 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 
18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.921572 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.924550 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c01d945-dccd-468e-b6ce-d269f1715462" containerID="ba69ea8f43e82495d3d0e7e058d027b50b859ad9b2fa9814bdc66b981b428667" exitCode=0 Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.924627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6c01d945-dccd-468e-b6ce-d269f1715462","Type":"ContainerDied","Data":"ba69ea8f43e82495d3d0e7e058d027b50b859ad9b2fa9814bdc66b981b428667"} Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.926531 4749 generic.go:334] "Generic (PLEG): container finished" podID="7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e" containerID="3df2fe5bca8013ddf9c4561ea063551f8c6439e126a8975585ddbc3344eac82e" exitCode=0 Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.926680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e","Type":"ContainerDied","Data":"3df2fe5bca8013ddf9c4561ea063551f8c6439e126a8975585ddbc3344eac82e"} Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.943962 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.736269435 podStartE2EDuration="27.943942733s" podCreationTimestamp="2026-01-28 18:55:50 +0000 UTC" firstStartedPulling="2026-01-28 18:56:00.61340332 +0000 UTC m=+1228.624930095" lastFinishedPulling="2026-01-28 18:56:16.821076618 +0000 UTC m=+1244.832603393" observedRunningTime="2026-01-28 18:56:17.939208405 +0000 UTC m=+1245.950735200" watchObservedRunningTime="2026-01-28 18:56:17.943942733 +0000 UTC m=+1245.955469508" Jan 28 18:56:17 crc kubenswrapper[4749]: I0128 18:56:17.979123 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7z7wf" podStartSLOduration=20.536741325 podStartE2EDuration="29.979101375s" podCreationTimestamp="2026-01-28 18:55:48 +0000 UTC" firstStartedPulling="2026-01-28 18:55:59.828190035 +0000 UTC m=+1227.839716810" lastFinishedPulling="2026-01-28 18:56:09.270550085 +0000 UTC m=+1237.282076860" observedRunningTime="2026-01-28 18:56:17.969070096 +0000 UTC m=+1245.980596871" watchObservedRunningTime="2026-01-28 18:56:17.979101375 +0000 UTC m=+1245.990628150" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.023305 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" podStartSLOduration=17.023282019 podStartE2EDuration="17.023282019s" podCreationTimestamp="2026-01-28 18:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:56:18.010643526 +0000 UTC m=+1246.022170311" watchObservedRunningTime="2026-01-28 18:56:18.023282019 +0000 UTC m=+1246.034808804" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.032609 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.032700 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.090168 4749 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" podStartSLOduration=17.090144397 podStartE2EDuration="17.090144397s" podCreationTimestamp="2026-01-28 18:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:56:18.083456851 +0000 UTC m=+1246.094983636" watchObservedRunningTime="2026-01-28 18:56:18.090144397 +0000 UTC m=+1246.101671182" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.683149 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.757705 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.883121 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3ee51e-214c-4efe-875b-51db3beea94c" path="/var/lib/kubelet/pods/8b3ee51e-214c-4efe-875b-51db3beea94c/volumes" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.935946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6c01d945-dccd-468e-b6ce-d269f1715462","Type":"ContainerStarted","Data":"5047c610159812fc84374ea1a390a37dc485b02795c160c317a2430b7bd1a41e"} Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.937713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e","Type":"ContainerStarted","Data":"04a439eedb70c77b32d0133c94be766f9e9117a6b7c8bede1a21dd2f785a2b53"} Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.938817 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.966849 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.567199279 podStartE2EDuration="39.966826889s" podCreationTimestamp="2026-01-28 18:55:39 +0000 UTC" firstStartedPulling="2026-01-28 18:55:58.86908404 +0000 UTC m=+1226.880610815" lastFinishedPulling="2026-01-28 18:56:09.26871165 +0000 UTC m=+1237.280238425" observedRunningTime="2026-01-28 18:56:18.959712423 +0000 UTC m=+1246.971239208" watchObservedRunningTime="2026-01-28 18:56:18.966826889 +0000 UTC m=+1246.978353664" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.991220 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.178398722 podStartE2EDuration="37.991196184s" podCreationTimestamp="2026-01-28 18:55:41 +0000 UTC" firstStartedPulling="2026-01-28 18:55:59.457694561 +0000 UTC m=+1227.469221336" lastFinishedPulling="2026-01-28 18:56:09.270492033 +0000 UTC m=+1237.282018798" observedRunningTime="2026-01-28 18:56:18.983976294 +0000 UTC m=+1246.995503079" watchObservedRunningTime="2026-01-28 18:56:18.991196184 +0000 UTC m=+1247.002722959" Jan 28 18:56:18 crc kubenswrapper[4749]: I0128 18:56:18.997515 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 28 18:56:20 crc kubenswrapper[4749]: I0128 18:56:20.953270 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerID="a78f9832d685bb0fcd3bdf9173db5cb564cf3359b95c2bea5f2b936a26aee910" exitCode=0 Jan 28 18:56:20 crc kubenswrapper[4749]: I0128 
18:56:20.953358 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerDied","Data":"a78f9832d685bb0fcd3bdf9173db5cb564cf3359b95c2bea5f2b936a26aee910"} Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.069951 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.136527 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.204413 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.204597 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.273159 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 28 18:56:21 crc kubenswrapper[4749]: E0128 18:56:21.273633 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3ee51e-214c-4efe-875b-51db3beea94c" containerName="init" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.273656 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3ee51e-214c-4efe-875b-51db3beea94c" containerName="init" Jan 28 18:56:21 crc kubenswrapper[4749]: E0128 18:56:21.273682 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" containerName="init" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.273689 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" containerName="init" Jan 28 18:56:21 crc kubenswrapper[4749]: E0128 18:56:21.273703 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3ee51e-214c-4efe-875b-51db3beea94c" containerName="dnsmasq-dns" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.273710 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3ee51e-214c-4efe-875b-51db3beea94c" containerName="dnsmasq-dns" Jan 28 18:56:21 crc kubenswrapper[4749]: E0128 18:56:21.273719 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" containerName="dnsmasq-dns" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.273728 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" containerName="dnsmasq-dns" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.273959 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3ee51e-214c-4efe-875b-51db3beea94c" containerName="dnsmasq-dns" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.273994 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1004cb-d7e0-4f3a-9739-33d4afdec629" containerName="dnsmasq-dns" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.275310 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.277864 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.278148 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.278430 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.278562 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-q4dd9" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.314909 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.426598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.426699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-config\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.426752 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.426775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.426845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb94s\" (UniqueName: \"kubernetes.io/projected/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-kube-api-access-vb94s\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.426888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-scripts\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.426949 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: 
I0128 18:56:21.528457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-config\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.528511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.528532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.528599 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb94s\" (UniqueName: \"kubernetes.io/projected/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-kube-api-access-vb94s\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.528616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-scripts\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.528685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.528762 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.529087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.529652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-scripts\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.529735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-config\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.535626 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.535882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.536405 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.550538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb94s\" (UniqueName: \"kubernetes.io/projected/1f41ddc7-edf9-4de5-9e85-3a7ec97de75f-kube-api-access-vb94s\") pod \"ovn-northd-0\" (UID: \"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f\") " pod="openstack/ovn-northd-0" Jan 28 18:56:21 crc kubenswrapper[4749]: I0128 18:56:21.608684 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 28 18:56:22 crc kubenswrapper[4749]: I0128 18:56:22.124820 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 28 18:56:22 crc kubenswrapper[4749]: W0128 18:56:22.127618 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f41ddc7_edf9_4de5_9e85_3a7ec97de75f.slice/crio-255d79b781d976ff775ff65329c3de3184b824b410a27260c9ee1454d0f8b0a7 WatchSource:0}: Error finding container 255d79b781d976ff775ff65329c3de3184b824b410a27260c9ee1454d0f8b0a7: Status 404 returned error can't find the container with id 255d79b781d976ff775ff65329c3de3184b824b410a27260c9ee1454d0f8b0a7 Jan 28 18:56:22 crc kubenswrapper[4749]: I0128 18:56:22.742411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 28 18:56:22 crc kubenswrapper[4749]: I0128 18:56:22.742781 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 28 18:56:22 crc kubenswrapper[4749]: I0128 18:56:22.977047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f","Type":"ContainerStarted","Data":"255d79b781d976ff775ff65329c3de3184b824b410a27260c9ee1454d0f8b0a7"} Jan 28 18:56:23 crc kubenswrapper[4749]: I0128 18:56:23.613412 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 28 18:56:23 crc kubenswrapper[4749]: I0128 18:56:23.701207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 28 18:56:23 crc kubenswrapper[4749]: I0128 18:56:23.988651 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f","Type":"ContainerStarted","Data":"0d24a5c7e997383e6cb4ad6f6707ebebf944a166f166d869a9bae2eca7248e9d"} Jan 28 18:56:23 crc 
kubenswrapper[4749]: I0128 18:56:23.988921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1f41ddc7-edf9-4de5-9e85-3a7ec97de75f","Type":"ContainerStarted","Data":"6eee65ee474746e03b1894bf4f28dde6579c02885fe4f7dbf6b4ee6c76c61fe2"} Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.011053 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.6982634490000001 podStartE2EDuration="3.011031782s" podCreationTimestamp="2026-01-28 18:56:21 +0000 UTC" firstStartedPulling="2026-01-28 18:56:22.129417058 +0000 UTC m=+1250.140943833" lastFinishedPulling="2026-01-28 18:56:23.442185391 +0000 UTC m=+1251.453712166" observedRunningTime="2026-01-28 18:56:24.004755966 +0000 UTC m=+1252.016282761" watchObservedRunningTime="2026-01-28 18:56:24.011031782 +0000 UTC m=+1252.022558567" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.534070 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7pqm"] Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.536002 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.563703 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-b604-account-create-update-h4ghn"] Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.566519 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.570981 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.582751 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7pqm"] Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.624516 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2649d\" (UniqueName: \"kubernetes.io/projected/6c146686-53c3-4a99-96fb-db254774ab8f-kube-api-access-2649d\") pod \"mysqld-exporter-openstack-db-create-g7pqm\" (UID: \"6c146686-53c3-4a99-96fb-db254774ab8f\") " pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.624580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c146686-53c3-4a99-96fb-db254774ab8f-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-g7pqm\" (UID: \"6c146686-53c3-4a99-96fb-db254774ab8f\") " pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.624675 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7cpb\" (UniqueName: \"kubernetes.io/projected/4fda79a3-23ee-45f8-a75b-824d790b8304-kube-api-access-r7cpb\") pod \"mysqld-exporter-b604-account-create-update-h4ghn\" (UID: \"4fda79a3-23ee-45f8-a75b-824d790b8304\") " pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.624751 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4fda79a3-23ee-45f8-a75b-824d790b8304-operator-scripts\") pod \"mysqld-exporter-b604-account-create-update-h4ghn\" (UID: \"4fda79a3-23ee-45f8-a75b-824d790b8304\") " pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.641226 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-b604-account-create-update-h4ghn"] Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.668034 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-fxd7v"] Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.668274 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" podUID="df19a7b0-50b1-418f-9822-48f47b2be635" containerName="dnsmasq-dns" containerID="cri-o://7e8ff099c9d57108014c9b2becae38bc2bcf503b098dec5d4678d37e7d16956f" gracePeriod=10 Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.670281 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.726286 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7cpb\" (UniqueName: \"kubernetes.io/projected/4fda79a3-23ee-45f8-a75b-824d790b8304-kube-api-access-r7cpb\") pod \"mysqld-exporter-b604-account-create-update-h4ghn\" (UID: \"4fda79a3-23ee-45f8-a75b-824d790b8304\") " pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.726458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fda79a3-23ee-45f8-a75b-824d790b8304-operator-scripts\") pod \"mysqld-exporter-b604-account-create-update-h4ghn\" (UID: \"4fda79a3-23ee-45f8-a75b-824d790b8304\") " pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.726525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2649d\" (UniqueName: \"kubernetes.io/projected/6c146686-53c3-4a99-96fb-db254774ab8f-kube-api-access-2649d\") pod \"mysqld-exporter-openstack-db-create-g7pqm\" (UID: \"6c146686-53c3-4a99-96fb-db254774ab8f\") " pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.726550 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c146686-53c3-4a99-96fb-db254774ab8f-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-g7pqm\" (UID: \"6c146686-53c3-4a99-96fb-db254774ab8f\") " pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.727321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c146686-53c3-4a99-96fb-db254774ab8f-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-g7pqm\" (UID: \"6c146686-53c3-4a99-96fb-db254774ab8f\") " pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.728797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fda79a3-23ee-45f8-a75b-824d790b8304-operator-scripts\") pod 
\"mysqld-exporter-b604-account-create-update-h4ghn\" (UID: \"4fda79a3-23ee-45f8-a75b-824d790b8304\") " pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.766422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2649d\" (UniqueName: \"kubernetes.io/projected/6c146686-53c3-4a99-96fb-db254774ab8f-kube-api-access-2649d\") pod \"mysqld-exporter-openstack-db-create-g7pqm\" (UID: \"6c146686-53c3-4a99-96fb-db254774ab8f\") " pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.778049 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7cpb\" (UniqueName: \"kubernetes.io/projected/4fda79a3-23ee-45f8-a75b-824d790b8304-kube-api-access-r7cpb\") pod \"mysqld-exporter-b604-account-create-update-h4ghn\" (UID: \"4fda79a3-23ee-45f8-a75b-824d790b8304\") " pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.778134 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-7d8jv"] Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.780496 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.803495 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.815509 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7d8jv"] Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.864960 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.908821 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.934427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-config\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.934540 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.934821 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-dns-svc\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.934846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25sp8\" (UniqueName: \"kubernetes.io/projected/8903650b-b615-47c3-95e7-033bba3b379a-kube-api-access-25sp8\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:24 crc kubenswrapper[4749]: I0128 18:56:24.934894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.027803 4749 generic.go:334] "Generic (PLEG): container finished" podID="df19a7b0-50b1-418f-9822-48f47b2be635" containerID="7e8ff099c9d57108014c9b2becae38bc2bcf503b098dec5d4678d37e7d16956f" exitCode=0 Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.028388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" event={"ID":"df19a7b0-50b1-418f-9822-48f47b2be635","Type":"ContainerDied","Data":"7e8ff099c9d57108014c9b2becae38bc2bcf503b098dec5d4678d37e7d16956f"} Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.028519 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.036789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.036905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-config\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: 
\"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.037414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.037499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-dns-svc\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.038167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-config\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.038302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.038923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.039203 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25sp8\" (UniqueName: \"kubernetes.io/projected/8903650b-b615-47c3-95e7-033bba3b379a-kube-api-access-25sp8\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.039635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-dns-svc\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.062654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25sp8\" (UniqueName: \"kubernetes.io/projected/8903650b-b615-47c3-95e7-033bba3b379a-kube-api-access-25sp8\") pod \"dnsmasq-dns-698758b865-7d8jv\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.124269 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.176005 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.275610 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.280267 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.453732 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2tdl\" (UniqueName: \"kubernetes.io/projected/df19a7b0-50b1-418f-9822-48f47b2be635-kube-api-access-p2tdl\") pod \"df19a7b0-50b1-418f-9822-48f47b2be635\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.454021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-dns-svc\") pod \"df19a7b0-50b1-418f-9822-48f47b2be635\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.454047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-ovsdbserver-nb\") pod \"df19a7b0-50b1-418f-9822-48f47b2be635\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.454364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-config\") pod \"df19a7b0-50b1-418f-9822-48f47b2be635\" (UID: \"df19a7b0-50b1-418f-9822-48f47b2be635\") " Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.488938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df19a7b0-50b1-418f-9822-48f47b2be635-kube-api-access-p2tdl" (OuterVolumeSpecName: "kube-api-access-p2tdl") pod "df19a7b0-50b1-418f-9822-48f47b2be635" (UID: "df19a7b0-50b1-418f-9822-48f47b2be635"). InnerVolumeSpecName "kube-api-access-p2tdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.527149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df19a7b0-50b1-418f-9822-48f47b2be635" (UID: "df19a7b0-50b1-418f-9822-48f47b2be635"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.578432 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2tdl\" (UniqueName: \"kubernetes.io/projected/df19a7b0-50b1-418f-9822-48f47b2be635-kube-api-access-p2tdl\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.578478 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.580894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df19a7b0-50b1-418f-9822-48f47b2be635" (UID: "df19a7b0-50b1-418f-9822-48f47b2be635"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.597378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-config" (OuterVolumeSpecName: "config") pod "df19a7b0-50b1-418f-9822-48f47b2be635" (UID: "df19a7b0-50b1-418f-9822-48f47b2be635"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.680411 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.680454 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df19a7b0-50b1-418f-9822-48f47b2be635-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.782589 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7pqm"] Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.836499 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 28 18:56:25 crc kubenswrapper[4749]: E0128 18:56:25.837400 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df19a7b0-50b1-418f-9822-48f47b2be635" containerName="dnsmasq-dns" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.837420 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="df19a7b0-50b1-418f-9822-48f47b2be635" containerName="dnsmasq-dns" Jan 28 18:56:25 crc kubenswrapper[4749]: E0128 18:56:25.837451 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df19a7b0-50b1-418f-9822-48f47b2be635" containerName="init" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.837458 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="df19a7b0-50b1-418f-9822-48f47b2be635" containerName="init" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.837918 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="df19a7b0-50b1-418f-9822-48f47b2be635" containerName="dnsmasq-dns" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.872976 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-b604-account-create-update-h4ghn"] Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.873099 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/swift-storage-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.877916 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-j5d4d" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.878016 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.878061 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.878700 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.882753 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.891263 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7d8jv"] Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.985815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.987624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-df0fde5b-5507-4211-82fe-576776c078c8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df0fde5b-5507-4211-82fe-576776c078c8\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.987960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1992d04-5d9f-498f-bee7-f2ab001feb76-lock\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.988042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2bt\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-kube-api-access-vk2bt\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.988062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1992d04-5d9f-498f-bee7-f2ab001feb76-cache\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:25 crc kubenswrapper[4749]: I0128 18:56:25.988213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1992d04-5d9f-498f-bee7-f2ab001feb76-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.044840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" 
event={"ID":"6c146686-53c3-4a99-96fb-db254774ab8f","Type":"ContainerStarted","Data":"dbd20910ff8efd9a726cccd6a6b98b72130b28729fed30c25ce195cdf04f6f5e"} Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.052249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7d8jv" event={"ID":"8903650b-b615-47c3-95e7-033bba3b379a","Type":"ContainerStarted","Data":"3fd2d56f89cba00975d76856651bbb41c380ba1a5e2e10a6ce07fe0c4efa9b3a"} Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.055427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" event={"ID":"4fda79a3-23ee-45f8-a75b-824d790b8304","Type":"ContainerStarted","Data":"23145a22a8145c83e7d69a7794e2c066c3c9ecd8ac8d2a9ba1cbed0efa8b6045"} Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.062489 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.062880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-fxd7v" event={"ID":"df19a7b0-50b1-418f-9822-48f47b2be635","Type":"ContainerDied","Data":"a19450ec92d3794afc231dd341e69d248658b200ec53990d3af7f33003151dc5"} Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.062970 4749 scope.go:117] "RemoveContainer" containerID="7e8ff099c9d57108014c9b2becae38bc2bcf503b098dec5d4678d37e7d16956f" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.089791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.089857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-df0fde5b-5507-4211-82fe-576776c078c8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df0fde5b-5507-4211-82fe-576776c078c8\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.089927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1992d04-5d9f-498f-bee7-f2ab001feb76-lock\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.089960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2bt\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-kube-api-access-vk2bt\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.089981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1992d04-5d9f-498f-bee7-f2ab001feb76-cache\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: E0128 18:56:26.089989 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 18:56:26 crc kubenswrapper[4749]: E0128 18:56:26.090010 4749 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.090032 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1992d04-5d9f-498f-bee7-f2ab001feb76-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: E0128 18:56:26.090087 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift podName:a1992d04-5d9f-498f-bee7-f2ab001feb76 nodeName:}" failed. No retries permitted until 2026-01-28 18:56:26.59004563 +0000 UTC m=+1254.601572465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift") pod "swift-storage-0" (UID: "a1992d04-5d9f-498f-bee7-f2ab001feb76") : configmap "swift-ring-files" not found Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.091103 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1992d04-5d9f-498f-bee7-f2ab001feb76-cache\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.093204 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.093260 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-df0fde5b-5507-4211-82fe-576776c078c8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df0fde5b-5507-4211-82fe-576776c078c8\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/853dd092c7f4893ecb829a7b562c0536a8df74f183352835f8162ce949dc9bc5/globalmount\"" pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.093903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1992d04-5d9f-498f-bee7-f2ab001feb76-lock\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.095027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1992d04-5d9f-498f-bee7-f2ab001feb76-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.095643 4749 scope.go:117] "RemoveContainer" containerID="3413f4d96df0277fa280b9dd23fe07f63586e001d981370c23411031e2d02b46" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.113215 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2bt\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-kube-api-access-vk2bt\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.114502 
4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-fxd7v"] Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.122263 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-fxd7v"] Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.168177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-df0fde5b-5507-4211-82fe-576776c078c8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df0fde5b-5507-4211-82fe-576776c078c8\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.601208 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:26 crc kubenswrapper[4749]: E0128 18:56:26.601379 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 18:56:26 crc kubenswrapper[4749]: E0128 18:56:26.601615 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 18:56:26 crc kubenswrapper[4749]: E0128 18:56:26.601671 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift podName:a1992d04-5d9f-498f-bee7-f2ab001feb76 nodeName:}" failed. No retries permitted until 2026-01-28 18:56:27.601655022 +0000 UTC m=+1255.613181797 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift") pod "swift-storage-0" (UID: "a1992d04-5d9f-498f-bee7-f2ab001feb76") : configmap "swift-ring-files" not found Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.711811 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:26 crc kubenswrapper[4749]: I0128 18:56:26.918131 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df19a7b0-50b1-418f-9822-48f47b2be635" path="/var/lib/kubelet/pods/df19a7b0-50b1-418f-9822-48f47b2be635/volumes" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.090520 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c146686-53c3-4a99-96fb-db254774ab8f" containerID="8cba7203c94e757479a1372a56e863942d63e9ad211e57cf733c8c1103757374" exitCode=0 Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.090917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" event={"ID":"6c146686-53c3-4a99-96fb-db254774ab8f","Type":"ContainerDied","Data":"8cba7203c94e757479a1372a56e863942d63e9ad211e57cf733c8c1103757374"} Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.092407 4749 generic.go:334] "Generic (PLEG): container finished" podID="8903650b-b615-47c3-95e7-033bba3b379a" containerID="759dfa449e6b8880e13e736e121ecf39a19cd9698500bca1e622ccbba25ff5e6" exitCode=0 Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.092447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7d8jv" 
event={"ID":"8903650b-b615-47c3-95e7-033bba3b379a","Type":"ContainerDied","Data":"759dfa449e6b8880e13e736e121ecf39a19cd9698500bca1e622ccbba25ff5e6"} Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.094162 4749 generic.go:334] "Generic (PLEG): container finished" podID="4fda79a3-23ee-45f8-a75b-824d790b8304" containerID="b0a2e5d15d4e0d8f93a4d664c61b639efd045fdea7c8a5511fce3dee5896d9f4" exitCode=0 Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.094231 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" event={"ID":"4fda79a3-23ee-45f8-a75b-824d790b8304","Type":"ContainerDied","Data":"b0a2e5d15d4e0d8f93a4d664c61b639efd045fdea7c8a5511fce3dee5896d9f4"} Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.626002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:27 crc kubenswrapper[4749]: E0128 18:56:27.626444 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 18:56:27 crc kubenswrapper[4749]: E0128 18:56:27.626990 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 18:56:27 crc kubenswrapper[4749]: E0128 18:56:27.627053 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift podName:a1992d04-5d9f-498f-bee7-f2ab001feb76 nodeName:}" failed. No retries permitted until 2026-01-28 18:56:29.62703369 +0000 UTC m=+1257.638560475 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift") pod "swift-storage-0" (UID: "a1992d04-5d9f-498f-bee7-f2ab001feb76") : configmap "swift-ring-files" not found Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.672910 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-sczdj"] Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.674191 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sczdj" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.688667 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sczdj"] Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.810758 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e400-account-create-update-fc5xv"] Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.813110 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.815258 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.829244 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e400-account-create-update-fc5xv"] Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.833712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-operator-scripts\") pod \"glance-db-create-sczdj\" (UID: \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\") " pod="openstack/glance-db-create-sczdj" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.833779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjm7\" (UniqueName: \"kubernetes.io/projected/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-kube-api-access-pfjm7\") pod \"glance-db-create-sczdj\" (UID: \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\") " pod="openstack/glance-db-create-sczdj" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.936274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-operator-scripts\") pod \"glance-db-create-sczdj\" (UID: \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\") " pod="openstack/glance-db-create-sczdj" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.936334 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkl96\" (UniqueName: \"kubernetes.io/projected/f46c02d7-38cd-48a6-acd9-dc45c744ce86-kube-api-access-lkl96\") pod \"glance-e400-account-create-update-fc5xv\" (UID: \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\") " pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.936375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjm7\" (UniqueName: \"kubernetes.io/projected/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-kube-api-access-pfjm7\") pod \"glance-db-create-sczdj\" (UID: \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\") " pod="openstack/glance-db-create-sczdj" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.936449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46c02d7-38cd-48a6-acd9-dc45c744ce86-operator-scripts\") pod \"glance-e400-account-create-update-fc5xv\" (UID: \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\") " pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.937422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-operator-scripts\") pod \"glance-db-create-sczdj\" (UID: \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\") " pod="openstack/glance-db-create-sczdj" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.958635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfjm7\" (UniqueName: \"kubernetes.io/projected/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-kube-api-access-pfjm7\") pod \"glance-db-create-sczdj\" (UID: 
\"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\") " pod="openstack/glance-db-create-sczdj" Jan 28 18:56:27 crc kubenswrapper[4749]: I0128 18:56:27.999003 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sczdj" Jan 28 18:56:28 crc kubenswrapper[4749]: I0128 18:56:28.040029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkl96\" (UniqueName: \"kubernetes.io/projected/f46c02d7-38cd-48a6-acd9-dc45c744ce86-kube-api-access-lkl96\") pod \"glance-e400-account-create-update-fc5xv\" (UID: \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\") " pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:28 crc kubenswrapper[4749]: I0128 18:56:28.040204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46c02d7-38cd-48a6-acd9-dc45c744ce86-operator-scripts\") pod \"glance-e400-account-create-update-fc5xv\" (UID: \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\") " pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:28 crc kubenswrapper[4749]: I0128 18:56:28.040908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46c02d7-38cd-48a6-acd9-dc45c744ce86-operator-scripts\") pod \"glance-e400-account-create-update-fc5xv\" (UID: \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\") " pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:28 crc kubenswrapper[4749]: I0128 18:56:28.081483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkl96\" (UniqueName: \"kubernetes.io/projected/f46c02d7-38cd-48a6-acd9-dc45c744ce86-kube-api-access-lkl96\") pod \"glance-e400-account-create-update-fc5xv\" (UID: \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\") " pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:28 crc kubenswrapper[4749]: I0128 18:56:28.108146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7d8jv" event={"ID":"8903650b-b615-47c3-95e7-033bba3b379a","Type":"ContainerStarted","Data":"a5e5d017fffb048aeb5ea24f8150a80f5a89277727e4e7310a203ab197fc06b2"} Jan 28 18:56:28 crc kubenswrapper[4749]: I0128 18:56:28.134587 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:28 crc kubenswrapper[4749]: I0128 18:56:28.135532 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-7d8jv" podStartSLOduration=4.135495406 podStartE2EDuration="4.135495406s" podCreationTimestamp="2026-01-28 18:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:56:28.128231785 +0000 UTC m=+1256.139758570" watchObservedRunningTime="2026-01-28 18:56:28.135495406 +0000 UTC m=+1256.147022171" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.116529 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.682720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:32 crc kubenswrapper[4749]: E0128 18:56:29.682952 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 18:56:32 crc kubenswrapper[4749]: E0128 18:56:29.682992 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 18:56:32 crc kubenswrapper[4749]: E0128 18:56:29.683066 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift podName:a1992d04-5d9f-498f-bee7-f2ab001feb76 nodeName:}" failed. No retries permitted until 2026-01-28 18:56:33.683043068 +0000 UTC m=+1261.694569843 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift") pod "swift-storage-0" (UID: "a1992d04-5d9f-498f-bee7-f2ab001feb76") : configmap "swift-ring-files" not found Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.728179 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mhwkd"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.729648 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.731661 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.731712 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.731837 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.738201 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mhwkd"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.792308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-combined-ca-bundle\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.792644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-swiftconf\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.792718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-dispersionconf\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.792764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42chm\" (UniqueName: \"kubernetes.io/projected/26e02b25-1356-40b3-b33a-947082d120e0-kube-api-access-42chm\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.792898 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-ring-data-devices\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.793022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-scripts\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.793051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26e02b25-1356-40b3-b33a-947082d120e0-etc-swift\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 
18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.798524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5sffp"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.799794 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.803518 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.811556 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5sffp"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.894673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-swiftconf\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.894737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-dispersionconf\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.894781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42chm\" (UniqueName: \"kubernetes.io/projected/26e02b25-1356-40b3-b33a-947082d120e0-kube-api-access-42chm\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.894845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-ring-data-devices\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.895628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-ring-data-devices\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.894930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-scripts\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.895694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26e02b25-1356-40b3-b33a-947082d120e0-etc-swift\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.895730 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfkx2\" 
(UniqueName: \"kubernetes.io/projected/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-kube-api-access-vfkx2\") pod \"root-account-create-update-5sffp\" (UID: \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\") " pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.895792 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-operator-scripts\") pod \"root-account-create-update-5sffp\" (UID: \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\") " pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.896005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-combined-ca-bundle\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.896144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-scripts\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.897596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26e02b25-1356-40b3-b33a-947082d120e0-etc-swift\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.900889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-swiftconf\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.900976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-combined-ca-bundle\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.902706 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-dispersionconf\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.933040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42chm\" (UniqueName: \"kubernetes.io/projected/26e02b25-1356-40b3-b33a-947082d120e0-kube-api-access-42chm\") pod \"swift-ring-rebalance-mhwkd\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.999665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfkx2\" (UniqueName: 
\"kubernetes.io/projected/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-kube-api-access-vfkx2\") pod \"root-account-create-update-5sffp\" (UID: \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\") " pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:29.999725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-operator-scripts\") pod \"root-account-create-update-5sffp\" (UID: \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\") " pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:30.000466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-operator-scripts\") pod \"root-account-create-update-5sffp\" (UID: \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\") " pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:30.020803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfkx2\" (UniqueName: \"kubernetes.io/projected/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-kube-api-access-vfkx2\") pod \"root-account-create-update-5sffp\" (UID: \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\") " pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:30.051823 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:30.122101 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:31.821774 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-ff46b7d8d-w6bmg" podUID="6e59c97f-819b-46d8-98a2-7a491b0b7c9e" containerName="console" containerID="cri-o://5b0bbb45ed106557a40b2ef270f22245db586d4486855e17b5f77aeef8aa1046" gracePeriod=15 Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.045004 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.096047 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-n2jzp"] Jan 28 18:56:32 crc kubenswrapper[4749]: E0128 18:56:32.096556 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fda79a3-23ee-45f8-a75b-824d790b8304" containerName="mariadb-account-create-update" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.096572 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fda79a3-23ee-45f8-a75b-824d790b8304" containerName="mariadb-account-create-update" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.096827 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fda79a3-23ee-45f8-a75b-824d790b8304" containerName="mariadb-account-create-update" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.097733 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.105811 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-n2jzp"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.142925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fda79a3-23ee-45f8-a75b-824d790b8304-operator-scripts\") pod \"4fda79a3-23ee-45f8-a75b-824d790b8304\" (UID: \"4fda79a3-23ee-45f8-a75b-824d790b8304\") " Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.143172 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7cpb\" (UniqueName: \"kubernetes.io/projected/4fda79a3-23ee-45f8-a75b-824d790b8304-kube-api-access-r7cpb\") pod \"4fda79a3-23ee-45f8-a75b-824d790b8304\" (UID: \"4fda79a3-23ee-45f8-a75b-824d790b8304\") " Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.144038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mns5g\" (UniqueName: \"kubernetes.io/projected/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-kube-api-access-mns5g\") pod \"keystone-db-create-n2jzp\" (UID: \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\") " pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.144173 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-operator-scripts\") pod \"keystone-db-create-n2jzp\" (UID: \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\") " pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.145148 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fda79a3-23ee-45f8-a75b-824d790b8304-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fda79a3-23ee-45f8-a75b-824d790b8304" (UID: "4fda79a3-23ee-45f8-a75b-824d790b8304"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.158293 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fda79a3-23ee-45f8-a75b-824d790b8304-kube-api-access-r7cpb" (OuterVolumeSpecName: "kube-api-access-r7cpb") pod "4fda79a3-23ee-45f8-a75b-824d790b8304" (UID: "4fda79a3-23ee-45f8-a75b-824d790b8304"). InnerVolumeSpecName "kube-api-access-r7cpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.181434 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ff46b7d8d-w6bmg_6e59c97f-819b-46d8-98a2-7a491b0b7c9e/console/0.log" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.181480 4749 generic.go:334] "Generic (PLEG): container finished" podID="6e59c97f-819b-46d8-98a2-7a491b0b7c9e" containerID="5b0bbb45ed106557a40b2ef270f22245db586d4486855e17b5f77aeef8aa1046" exitCode=2 Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.181599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff46b7d8d-w6bmg" event={"ID":"6e59c97f-819b-46d8-98a2-7a491b0b7c9e","Type":"ContainerDied","Data":"5b0bbb45ed106557a40b2ef270f22245db586d4486855e17b5f77aeef8aa1046"} Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.194689 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0a33-account-create-update-wx8kt"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.196025 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.199906 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.204568 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a33-account-create-update-wx8kt"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.226430 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" event={"ID":"4fda79a3-23ee-45f8-a75b-824d790b8304","Type":"ContainerDied","Data":"23145a22a8145c83e7d69a7794e2c066c3c9ecd8ac8d2a9ba1cbed0efa8b6045"} Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.226762 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23145a22a8145c83e7d69a7794e2c066c3c9ecd8ac8d2a9ba1cbed0efa8b6045" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.226494 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-b604-account-create-update-h4ghn" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.249238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af817e85-9839-4aae-afdd-c764fac277a2-operator-scripts\") pod \"keystone-0a33-account-create-update-wx8kt\" (UID: \"af817e85-9839-4aae-afdd-c764fac277a2\") " pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.249495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mns5g\" (UniqueName: \"kubernetes.io/projected/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-kube-api-access-mns5g\") pod \"keystone-db-create-n2jzp\" (UID: \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\") " pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.249575 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-operator-scripts\") pod \"keystone-db-create-n2jzp\" (UID: \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\") " pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.250359 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t29v\" (UniqueName: \"kubernetes.io/projected/af817e85-9839-4aae-afdd-c764fac277a2-kube-api-access-5t29v\") pod \"keystone-0a33-account-create-update-wx8kt\" (UID: \"af817e85-9839-4aae-afdd-c764fac277a2\") " pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.250581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-operator-scripts\") pod \"keystone-db-create-n2jzp\" (UID: \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\") " pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.250781 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fda79a3-23ee-45f8-a75b-824d790b8304-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.250805 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7cpb\" (UniqueName: \"kubernetes.io/projected/4fda79a3-23ee-45f8-a75b-824d790b8304-kube-api-access-r7cpb\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.267423 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mns5g\" (UniqueName: \"kubernetes.io/projected/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-kube-api-access-mns5g\") pod \"keystone-db-create-n2jzp\" (UID: \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\") " pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.348117 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.354033 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af817e85-9839-4aae-afdd-c764fac277a2-operator-scripts\") pod \"keystone-0a33-account-create-update-wx8kt\" (UID: \"af817e85-9839-4aae-afdd-c764fac277a2\") " pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.354195 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t29v\" (UniqueName: \"kubernetes.io/projected/af817e85-9839-4aae-afdd-c764fac277a2-kube-api-access-5t29v\") pod \"keystone-0a33-account-create-update-wx8kt\" (UID: \"af817e85-9839-4aae-afdd-c764fac277a2\") " pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.358287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af817e85-9839-4aae-afdd-c764fac277a2-operator-scripts\") pod \"keystone-0a33-account-create-update-wx8kt\" (UID: \"af817e85-9839-4aae-afdd-c764fac277a2\") " pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.378073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t29v\" (UniqueName: \"kubernetes.io/projected/af817e85-9839-4aae-afdd-c764fac277a2-kube-api-access-5t29v\") pod \"keystone-0a33-account-create-update-wx8kt\" (UID: \"af817e85-9839-4aae-afdd-c764fac277a2\") " pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.403894 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jn56l"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.405774 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jn56l" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.427087 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jn56l"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.455979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f07580-cd44-43b4-a459-69d3984e1c09-operator-scripts\") pod \"placement-db-create-jn56l\" (UID: \"a4f07580-cd44-43b4-a459-69d3984e1c09\") " pod="openstack/placement-db-create-jn56l" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.456177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwf2r\" (UniqueName: \"kubernetes.io/projected/a4f07580-cd44-43b4-a459-69d3984e1c09-kube-api-access-kwf2r\") pod \"placement-db-create-jn56l\" (UID: \"a4f07580-cd44-43b4-a459-69d3984e1c09\") " pod="openstack/placement-db-create-jn56l" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.505935 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-80e6-account-create-update-zqbb9"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.507615 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.510190 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.540485 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-80e6-account-create-update-zqbb9"] Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.559037 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4ptx\" (UniqueName: \"kubernetes.io/projected/93549c28-1d00-40bc-bddc-3e0d93b913f2-kube-api-access-l4ptx\") pod \"placement-80e6-account-create-update-zqbb9\" (UID: \"93549c28-1d00-40bc-bddc-3e0d93b913f2\") " pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.559121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f07580-cd44-43b4-a459-69d3984e1c09-operator-scripts\") pod \"placement-db-create-jn56l\" (UID: \"a4f07580-cd44-43b4-a459-69d3984e1c09\") " pod="openstack/placement-db-create-jn56l" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.559243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwf2r\" (UniqueName: \"kubernetes.io/projected/a4f07580-cd44-43b4-a459-69d3984e1c09-kube-api-access-kwf2r\") pod \"placement-db-create-jn56l\" (UID: \"a4f07580-cd44-43b4-a459-69d3984e1c09\") " pod="openstack/placement-db-create-jn56l" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.559280 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93549c28-1d00-40bc-bddc-3e0d93b913f2-operator-scripts\") pod \"placement-80e6-account-create-update-zqbb9\" (UID: \"93549c28-1d00-40bc-bddc-3e0d93b913f2\") " pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.560452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f07580-cd44-43b4-a459-69d3984e1c09-operator-scripts\") pod \"placement-db-create-jn56l\" (UID: \"a4f07580-cd44-43b4-a459-69d3984e1c09\") " pod="openstack/placement-db-create-jn56l" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.582750 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwf2r\" (UniqueName: \"kubernetes.io/projected/a4f07580-cd44-43b4-a459-69d3984e1c09-kube-api-access-kwf2r\") pod \"placement-db-create-jn56l\" (UID: \"a4f07580-cd44-43b4-a459-69d3984e1c09\") " pod="openstack/placement-db-create-jn56l" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.660319 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.663013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93549c28-1d00-40bc-bddc-3e0d93b913f2-operator-scripts\") pod \"placement-80e6-account-create-update-zqbb9\" (UID: \"93549c28-1d00-40bc-bddc-3e0d93b913f2\") " pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.663249 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ptx\" (UniqueName: \"kubernetes.io/projected/93549c28-1d00-40bc-bddc-3e0d93b913f2-kube-api-access-l4ptx\") pod \"placement-80e6-account-create-update-zqbb9\" (UID: \"93549c28-1d00-40bc-bddc-3e0d93b913f2\") " pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.663784 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93549c28-1d00-40bc-bddc-3e0d93b913f2-operator-scripts\") pod \"placement-80e6-account-create-update-zqbb9\" (UID: \"93549c28-1d00-40bc-bddc-3e0d93b913f2\") " pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.686909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4ptx\" (UniqueName: \"kubernetes.io/projected/93549c28-1d00-40bc-bddc-3e0d93b913f2-kube-api-access-l4ptx\") pod \"placement-80e6-account-create-update-zqbb9\" (UID: \"93549c28-1d00-40bc-bddc-3e0d93b913f2\") " pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.730507 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jn56l" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.800077 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.829699 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.929457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c146686-53c3-4a99-96fb-db254774ab8f-operator-scripts\") pod \"6c146686-53c3-4a99-96fb-db254774ab8f\" (UID: \"6c146686-53c3-4a99-96fb-db254774ab8f\") " Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.930029 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2649d\" (UniqueName: \"kubernetes.io/projected/6c146686-53c3-4a99-96fb-db254774ab8f-kube-api-access-2649d\") pod \"6c146686-53c3-4a99-96fb-db254774ab8f\" (UID: \"6c146686-53c3-4a99-96fb-db254774ab8f\") " Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.930829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c146686-53c3-4a99-96fb-db254774ab8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c146686-53c3-4a99-96fb-db254774ab8f" (UID: "6c146686-53c3-4a99-96fb-db254774ab8f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.936182 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c146686-53c3-4a99-96fb-db254774ab8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:32 crc kubenswrapper[4749]: I0128 18:56:32.956094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c146686-53c3-4a99-96fb-db254774ab8f-kube-api-access-2649d" (OuterVolumeSpecName: "kube-api-access-2649d") pod "6c146686-53c3-4a99-96fb-db254774ab8f" (UID: "6c146686-53c3-4a99-96fb-db254774ab8f"). InnerVolumeSpecName "kube-api-access-2649d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.048198 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2649d\" (UniqueName: \"kubernetes.io/projected/6c146686-53c3-4a99-96fb-db254774ab8f-kube-api-access-2649d\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.105133 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ff46b7d8d-w6bmg_6e59c97f-819b-46d8-98a2-7a491b0b7c9e/console/0.log" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.105530 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.239128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerStarted","Data":"67106dc4864cfe613cad5fbd43a0e263e8a5d0985918253bff078189ad45e2e4"} Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.240952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" event={"ID":"6c146686-53c3-4a99-96fb-db254774ab8f","Type":"ContainerDied","Data":"dbd20910ff8efd9a726cccd6a6b98b72130b28729fed30c25ce195cdf04f6f5e"} Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.240974 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbd20910ff8efd9a726cccd6a6b98b72130b28729fed30c25ce195cdf04f6f5e" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.241042 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-g7pqm" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.247270 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ff46b7d8d-w6bmg_6e59c97f-819b-46d8-98a2-7a491b0b7c9e/console/0.log" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.247347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff46b7d8d-w6bmg" event={"ID":"6e59c97f-819b-46d8-98a2-7a491b0b7c9e","Type":"ContainerDied","Data":"58829424ca0b0a1ecfb9cf838ba3b4c868e12368852dcfdc7d7bbdd302d4b192"} Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.247441 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ff46b7d8d-w6bmg" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.247718 4749 scope.go:117] "RemoveContainer" containerID="5b0bbb45ed106557a40b2ef270f22245db586d4486855e17b5f77aeef8aa1046" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.252221 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6rh2\" (UniqueName: \"kubernetes.io/projected/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-kube-api-access-p6rh2\") pod \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.252309 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-serving-cert\") pod \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.252383 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-trusted-ca-bundle\") pod \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.252521 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-config\") pod \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.252585 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-service-ca\") pod \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.252608 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-oauth-config\") pod \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.252624 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-oauth-serving-cert\") pod \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\" (UID: \"6e59c97f-819b-46d8-98a2-7a491b0b7c9e\") " Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.253363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6e59c97f-819b-46d8-98a2-7a491b0b7c9e" (UID: "6e59c97f-819b-46d8-98a2-7a491b0b7c9e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.253572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6e59c97f-819b-46d8-98a2-7a491b0b7c9e" (UID: "6e59c97f-819b-46d8-98a2-7a491b0b7c9e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.253950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-service-ca" (OuterVolumeSpecName: "service-ca") pod "6e59c97f-819b-46d8-98a2-7a491b0b7c9e" (UID: "6e59c97f-819b-46d8-98a2-7a491b0b7c9e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.254117 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-config" (OuterVolumeSpecName: "console-config") pod "6e59c97f-819b-46d8-98a2-7a491b0b7c9e" (UID: "6e59c97f-819b-46d8-98a2-7a491b0b7c9e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.259706 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6e59c97f-819b-46d8-98a2-7a491b0b7c9e" (UID: "6e59c97f-819b-46d8-98a2-7a491b0b7c9e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.259851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-kube-api-access-p6rh2" (OuterVolumeSpecName: "kube-api-access-p6rh2") pod "6e59c97f-819b-46d8-98a2-7a491b0b7c9e" (UID: "6e59c97f-819b-46d8-98a2-7a491b0b7c9e"). InnerVolumeSpecName "kube-api-access-p6rh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.267231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6e59c97f-819b-46d8-98a2-7a491b0b7c9e" (UID: "6e59c97f-819b-46d8-98a2-7a491b0b7c9e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.355248 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.355284 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.355297 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.355309 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.355338 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6rh2\" (UniqueName: \"kubernetes.io/projected/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-kube-api-access-p6rh2\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.355352 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.355362 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e59c97f-819b-46d8-98a2-7a491b0b7c9e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.458600 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5sffp"] Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.466502 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.472320 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e400-account-create-update-fc5xv"] Jan 28 18:56:33 crc kubenswrapper[4749]: W0128 18:56:33.477990 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26e02b25_1356_40b3_b33a_947082d120e0.slice/crio-5ea06e475210f467037ea1e36b441637b13b5adb3a4d32f7fa8cc59d75c57b7f WatchSource:0}: Error finding container 5ea06e475210f467037ea1e36b441637b13b5adb3a4d32f7fa8cc59d75c57b7f: Status 404 returned error can't find the container with id 5ea06e475210f467037ea1e36b441637b13b5adb3a4d32f7fa8cc59d75c57b7f Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.485115 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.487037 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mhwkd"] Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.488892 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 28 18:56:33 crc 
kubenswrapper[4749]: I0128 18:56:33.495630 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-n2jzp"] Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.502965 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sczdj"] Jan 28 18:56:33 crc kubenswrapper[4749]: W0128 18:56:33.510587 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5966b5d5_e9e8_4c91_b8cf_8253a8dee0ef.slice/crio-dc52dbdd7782948aa08b1b642b6de87f8f2731a47c21c2ac94b058b49d393c29 WatchSource:0}: Error finding container dc52dbdd7782948aa08b1b642b6de87f8f2731a47c21c2ac94b058b49d393c29: Status 404 returned error can't find the container with id dc52dbdd7782948aa08b1b642b6de87f8f2731a47c21c2ac94b058b49d393c29 Jan 28 18:56:33 crc kubenswrapper[4749]: E0128 18:56:33.579075 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c146686_53c3_4a99_96fb_db254774ab8f.slice\": RecentStats: unable to find data in memory cache]" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.615239 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ff46b7d8d-w6bmg"] Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.634452 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-ff46b7d8d-w6bmg"] Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.771164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:33 crc kubenswrapper[4749]: E0128 18:56:33.771880 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 18:56:33 crc kubenswrapper[4749]: E0128 18:56:33.771905 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 18:56:33 crc kubenswrapper[4749]: E0128 18:56:33.771951 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift podName:a1992d04-5d9f-498f-bee7-f2ab001feb76 nodeName:}" failed. No retries permitted until 2026-01-28 18:56:41.771937479 +0000 UTC m=+1269.783464254 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift") pod "swift-storage-0" (UID: "a1992d04-5d9f-498f-bee7-f2ab001feb76") : configmap "swift-ring-files" not found Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.804092 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a33-account-create-update-wx8kt"] Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.812771 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jn56l"] Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.821505 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-80e6-account-create-update-zqbb9"] Jan 28 18:56:33 crc kubenswrapper[4749]: W0128 18:56:33.877088 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf817e85_9839_4aae_afdd_c764fac277a2.slice/crio-da82686f6ba7ab747f2ad95fcbba546e6637301af6b374497baddcbf86721436 WatchSource:0}: Error finding container da82686f6ba7ab747f2ad95fcbba546e6637301af6b374497baddcbf86721436: Status 404 returned error can't find the container with id da82686f6ba7ab747f2ad95fcbba546e6637301af6b374497baddcbf86721436 Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.891582 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 28 18:56:33 crc kubenswrapper[4749]: W0128 18:56:33.899148 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93549c28_1d00_40bc_bddc_3e0d93b913f2.slice/crio-cd66ae25490012941c0abb330f742a01de28a0830b399b4d105c6a9d772e920c WatchSource:0}: Error finding container cd66ae25490012941c0abb330f742a01de28a0830b399b4d105c6a9d772e920c: Status 404 returned error can't find the container with id cd66ae25490012941c0abb330f742a01de28a0830b399b4d105c6a9d772e920c Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.913555 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.939909 4749 patch_prober.go:28] interesting pod/console-ff46b7d8d-w6bmg container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.90:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 18:56:33 crc kubenswrapper[4749]: I0128 18:56:33.939979 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-ff46b7d8d-w6bmg" podUID="6e59c97f-819b-46d8-98a2-7a491b0b7c9e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.90:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.260973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jn56l" event={"ID":"a4f07580-cd44-43b4-a459-69d3984e1c09","Type":"ContainerStarted","Data":"cc5b0e81a030d2a95426ef6701fbc376a7d32b2ed769dbd5cc184f263acf7984"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.261046 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jn56l" 
event={"ID":"a4f07580-cd44-43b4-a459-69d3984e1c09","Type":"ContainerStarted","Data":"6306315ed1f1d1c5697ca3d4dc69e9b6e565059656cc3989f04c57f3acfee2f0"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.267464 4749 generic.go:334] "Generic (PLEG): container finished" podID="a4fc78d3-bd48-48df-b646-a7d44ed0bd3b" containerID="fecc9b27f86173262c43337af2e073e6deccf7a3304227458e32e61c10813aff" exitCode=0 Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.267559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sczdj" event={"ID":"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b","Type":"ContainerDied","Data":"fecc9b27f86173262c43337af2e073e6deccf7a3304227458e32e61c10813aff"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.267593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sczdj" event={"ID":"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b","Type":"ContainerStarted","Data":"bee1a959090fd640d073bd8b4ebc30b4fdc71856cd65e681e2d39a624643324b"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.269791 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc6f4e87-13b0-43ef-8a0c-d94025d5af0a" containerID="a0f54d4dc507ebe673b6600c741bb104f6fa42c3ddaa2b238abe945fd456c0fb" exitCode=0 Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.269894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5sffp" event={"ID":"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a","Type":"ContainerDied","Data":"a0f54d4dc507ebe673b6600c741bb104f6fa42c3ddaa2b238abe945fd456c0fb"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.269926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5sffp" event={"ID":"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a","Type":"ContainerStarted","Data":"f6bd2bd22d4f0559cde3af07acb7141a61e0d8e61e0f3239fce863baa0b1a10e"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.271802 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a33-account-create-update-wx8kt" event={"ID":"af817e85-9839-4aae-afdd-c764fac277a2","Type":"ContainerStarted","Data":"da82686f6ba7ab747f2ad95fcbba546e6637301af6b374497baddcbf86721436"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.278237 4749 generic.go:334] "Generic (PLEG): container finished" podID="5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef" containerID="d1b221d137ac5f4864550ffa146b1280923a550d9ba346d8b4cdc6eb65461619" exitCode=0 Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.278398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n2jzp" event={"ID":"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef","Type":"ContainerDied","Data":"d1b221d137ac5f4864550ffa146b1280923a550d9ba346d8b4cdc6eb65461619"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.278420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n2jzp" event={"ID":"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef","Type":"ContainerStarted","Data":"dc52dbdd7782948aa08b1b642b6de87f8f2731a47c21c2ac94b058b49d393c29"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.280241 4749 generic.go:334] "Generic (PLEG): container finished" podID="f46c02d7-38cd-48a6-acd9-dc45c744ce86" containerID="35ff58276818174f3329f6adecd8eb4e223c2c973c27e67b6953911af8dc5661" exitCode=0 Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.280305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e400-account-create-update-fc5xv" 
event={"ID":"f46c02d7-38cd-48a6-acd9-dc45c744ce86","Type":"ContainerDied","Data":"35ff58276818174f3329f6adecd8eb4e223c2c973c27e67b6953911af8dc5661"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.280355 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e400-account-create-update-fc5xv" event={"ID":"f46c02d7-38cd-48a6-acd9-dc45c744ce86","Type":"ContainerStarted","Data":"b8f262c89d552783137ef2fedfd1185ef357300815fa7324d560697862963fca"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.284523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-80e6-account-create-update-zqbb9" event={"ID":"93549c28-1d00-40bc-bddc-3e0d93b913f2","Type":"ContainerStarted","Data":"cd66ae25490012941c0abb330f742a01de28a0830b399b4d105c6a9d772e920c"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.286141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mhwkd" event={"ID":"26e02b25-1356-40b3-b33a-947082d120e0","Type":"ContainerStarted","Data":"5ea06e475210f467037ea1e36b441637b13b5adb3a4d32f7fa8cc59d75c57b7f"} Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.300990 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-jn56l" podStartSLOduration=2.300950033 podStartE2EDuration="2.300950033s" podCreationTimestamp="2026-01-28 18:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:56:34.278300841 +0000 UTC m=+1262.289827616" watchObservedRunningTime="2026-01-28 18:56:34.300950033 +0000 UTC m=+1262.312476808" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.764423 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-znzw7"] Jan 28 18:56:34 crc kubenswrapper[4749]: E0128 18:56:34.765149 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c146686-53c3-4a99-96fb-db254774ab8f" containerName="mariadb-database-create" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.765161 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c146686-53c3-4a99-96fb-db254774ab8f" containerName="mariadb-database-create" Jan 28 18:56:34 crc kubenswrapper[4749]: E0128 18:56:34.765185 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e59c97f-819b-46d8-98a2-7a491b0b7c9e" containerName="console" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.765190 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e59c97f-819b-46d8-98a2-7a491b0b7c9e" containerName="console" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.765411 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c146686-53c3-4a99-96fb-db254774ab8f" containerName="mariadb-database-create" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.765430 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e59c97f-819b-46d8-98a2-7a491b0b7c9e" containerName="console" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.766068 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.774188 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-znzw7"] Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.887562 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e59c97f-819b-46d8-98a2-7a491b0b7c9e" path="/var/lib/kubelet/pods/6e59c97f-819b-46d8-98a2-7a491b0b7c9e/volumes" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.919763 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5m2c\" (UniqueName: \"kubernetes.io/projected/094de6d7-0ad5-4985-9894-b004751d377b-kube-api-access-q5m2c\") pod \"mysqld-exporter-openstack-cell1-db-create-znzw7\" (UID: \"094de6d7-0ad5-4985-9894-b004751d377b\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.920122 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/094de6d7-0ad5-4985-9894-b004751d377b-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-znzw7\" (UID: \"094de6d7-0ad5-4985-9894-b004751d377b\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.968727 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-fc3a-account-create-update-jwst7"] Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.970157 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.971691 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Jan 28 18:56:34 crc kubenswrapper[4749]: I0128 18:56:34.978551 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-fc3a-account-create-update-jwst7"] Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.022261 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5gbx\" (UniqueName: \"kubernetes.io/projected/ea422f17-7b59-48ad-8434-557e5c0a6096-kube-api-access-j5gbx\") pod \"mysqld-exporter-fc3a-account-create-update-jwst7\" (UID: \"ea422f17-7b59-48ad-8434-557e5c0a6096\") " pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.022508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea422f17-7b59-48ad-8434-557e5c0a6096-operator-scripts\") pod \"mysqld-exporter-fc3a-account-create-update-jwst7\" (UID: \"ea422f17-7b59-48ad-8434-557e5c0a6096\") " pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.022648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/094de6d7-0ad5-4985-9894-b004751d377b-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-znzw7\" (UID: \"094de6d7-0ad5-4985-9894-b004751d377b\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:35 crc 
kubenswrapper[4749]: I0128 18:56:35.023534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/094de6d7-0ad5-4985-9894-b004751d377b-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-znzw7\" (UID: \"094de6d7-0ad5-4985-9894-b004751d377b\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.023526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5m2c\" (UniqueName: \"kubernetes.io/projected/094de6d7-0ad5-4985-9894-b004751d377b-kube-api-access-q5m2c\") pod \"mysqld-exporter-openstack-cell1-db-create-znzw7\" (UID: \"094de6d7-0ad5-4985-9894-b004751d377b\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.128194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea422f17-7b59-48ad-8434-557e5c0a6096-operator-scripts\") pod \"mysqld-exporter-fc3a-account-create-update-jwst7\" (UID: \"ea422f17-7b59-48ad-8434-557e5c0a6096\") " pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.128897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea422f17-7b59-48ad-8434-557e5c0a6096-operator-scripts\") pod \"mysqld-exporter-fc3a-account-create-update-jwst7\" (UID: \"ea422f17-7b59-48ad-8434-557e5c0a6096\") " pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.129083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5gbx\" (UniqueName: \"kubernetes.io/projected/ea422f17-7b59-48ad-8434-557e5c0a6096-kube-api-access-j5gbx\") pod \"mysqld-exporter-fc3a-account-create-update-jwst7\" (UID: \"ea422f17-7b59-48ad-8434-557e5c0a6096\") " pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.178646 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.245046 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s9fdf"] Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.245353 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" podUID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerName="dnsmasq-dns" containerID="cri-o://78648d519e6211ce1a563e570369fdd79e89a5cffbb3dd087d5a84fc0c978a12" gracePeriod=10 Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.257548 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5m2c\" (UniqueName: \"kubernetes.io/projected/094de6d7-0ad5-4985-9894-b004751d377b-kube-api-access-q5m2c\") pod \"mysqld-exporter-openstack-cell1-db-create-znzw7\" (UID: \"094de6d7-0ad5-4985-9894-b004751d377b\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.258705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5gbx\" (UniqueName: \"kubernetes.io/projected/ea422f17-7b59-48ad-8434-557e5c0a6096-kube-api-access-j5gbx\") pod 
\"mysqld-exporter-fc3a-account-create-update-jwst7\" (UID: \"ea422f17-7b59-48ad-8434-557e5c0a6096\") " pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.290983 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.314835 4749 generic.go:334] "Generic (PLEG): container finished" podID="af817e85-9839-4aae-afdd-c764fac277a2" containerID="a416d2ce979da07dfebefd888927a8f891995745288db76de51ae3127f21542e" exitCode=0 Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.315490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a33-account-create-update-wx8kt" event={"ID":"af817e85-9839-4aae-afdd-c764fac277a2","Type":"ContainerDied","Data":"a416d2ce979da07dfebefd888927a8f891995745288db76de51ae3127f21542e"} Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.322172 4749 generic.go:334] "Generic (PLEG): container finished" podID="a4f07580-cd44-43b4-a459-69d3984e1c09" containerID="cc5b0e81a030d2a95426ef6701fbc376a7d32b2ed769dbd5cc184f263acf7984" exitCode=0 Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.322223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jn56l" event={"ID":"a4f07580-cd44-43b4-a459-69d3984e1c09","Type":"ContainerDied","Data":"cc5b0e81a030d2a95426ef6701fbc376a7d32b2ed769dbd5cc184f263acf7984"} Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.341532 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-80e6-account-create-update-zqbb9" event={"ID":"93549c28-1d00-40bc-bddc-3e0d93b913f2","Type":"ContainerStarted","Data":"cd9352357a3a075908a05182180270262d056cc6d3b4e4c60181b6a14c3e7336"} Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.386525 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-80e6-account-create-update-zqbb9" podStartSLOduration=3.386508353 podStartE2EDuration="3.386508353s" podCreationTimestamp="2026-01-28 18:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:56:35.377621143 +0000 UTC m=+1263.389147918" watchObservedRunningTime="2026-01-28 18:56:35.386508353 +0000 UTC m=+1263.398035128" Jan 28 18:56:35 crc kubenswrapper[4749]: I0128 18:56:35.480831 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.282217 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sczdj" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.288825 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.331049 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.331395 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.362078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfjm7\" (UniqueName: \"kubernetes.io/projected/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-kube-api-access-pfjm7\") pod \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\" (UID: \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\") " Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.362151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-operator-scripts\") pod \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\" (UID: \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\") " Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.362213 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfkx2\" (UniqueName: \"kubernetes.io/projected/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-kube-api-access-vfkx2\") pod \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\" (UID: \"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a\") " Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.362234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-operator-scripts\") pod \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\" (UID: \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\") " Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.362260 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkl96\" (UniqueName: \"kubernetes.io/projected/f46c02d7-38cd-48a6-acd9-dc45c744ce86-kube-api-access-lkl96\") pod \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\" (UID: \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\") " Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.362292 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mns5g\" (UniqueName: \"kubernetes.io/projected/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-kube-api-access-mns5g\") pod \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\" (UID: \"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef\") " Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.362311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-operator-scripts\") pod \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\" (UID: \"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b\") " Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.362357 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46c02d7-38cd-48a6-acd9-dc45c744ce86-operator-scripts\") pod \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\" (UID: \"f46c02d7-38cd-48a6-acd9-dc45c744ce86\") " Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.366918 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef" (UID: "5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.367793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46c02d7-38cd-48a6-acd9-dc45c744ce86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f46c02d7-38cd-48a6-acd9-dc45c744ce86" (UID: "f46c02d7-38cd-48a6-acd9-dc45c744ce86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.369316 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc6f4e87-13b0-43ef-8a0c-d94025d5af0a" (UID: "fc6f4e87-13b0-43ef-8a0c-d94025d5af0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.375771 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4fc78d3-bd48-48df-b646-a7d44ed0bd3b" (UID: "a4fc78d3-bd48-48df-b646-a7d44ed0bd3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.376175 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-kube-api-access-mns5g" (OuterVolumeSpecName: "kube-api-access-mns5g") pod "5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef" (UID: "5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef"). InnerVolumeSpecName "kube-api-access-mns5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.377015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-kube-api-access-pfjm7" (OuterVolumeSpecName: "kube-api-access-pfjm7") pod "a4fc78d3-bd48-48df-b646-a7d44ed0bd3b" (UID: "a4fc78d3-bd48-48df-b646-a7d44ed0bd3b"). InnerVolumeSpecName "kube-api-access-pfjm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.383499 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46c02d7-38cd-48a6-acd9-dc45c744ce86-kube-api-access-lkl96" (OuterVolumeSpecName: "kube-api-access-lkl96") pod "f46c02d7-38cd-48a6-acd9-dc45c744ce86" (UID: "f46c02d7-38cd-48a6-acd9-dc45c744ce86"). InnerVolumeSpecName "kube-api-access-lkl96". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.383904 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-kube-api-access-vfkx2" (OuterVolumeSpecName: "kube-api-access-vfkx2") pod "fc6f4e87-13b0-43ef-8a0c-d94025d5af0a" (UID: "fc6f4e87-13b0-43ef-8a0c-d94025d5af0a"). InnerVolumeSpecName "kube-api-access-vfkx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.405370 4749 generic.go:334] "Generic (PLEG): container finished" podID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerID="78648d519e6211ce1a563e570369fdd79e89a5cffbb3dd087d5a84fc0c978a12" exitCode=0 Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.405473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" event={"ID":"68126ff8-8a74-4130-ba3f-a806dc4a8b2a","Type":"ContainerDied","Data":"78648d519e6211ce1a563e570369fdd79e89a5cffbb3dd087d5a84fc0c978a12"} Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.408178 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sczdj" event={"ID":"a4fc78d3-bd48-48df-b646-a7d44ed0bd3b","Type":"ContainerDied","Data":"bee1a959090fd640d073bd8b4ebc30b4fdc71856cd65e681e2d39a624643324b"} Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.408216 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bee1a959090fd640d073bd8b4ebc30b4fdc71856cd65e681e2d39a624643324b" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.408453 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sczdj" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.410803 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n2jzp" event={"ID":"5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef","Type":"ContainerDied","Data":"dc52dbdd7782948aa08b1b642b6de87f8f2731a47c21c2ac94b058b49d393c29"} Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.410824 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc52dbdd7782948aa08b1b642b6de87f8f2731a47c21c2ac94b058b49d393c29" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.410872 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n2jzp" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.425745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e400-account-create-update-fc5xv" event={"ID":"f46c02d7-38cd-48a6-acd9-dc45c744ce86","Type":"ContainerDied","Data":"b8f262c89d552783137ef2fedfd1185ef357300815fa7324d560697862963fca"} Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.425782 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8f262c89d552783137ef2fedfd1185ef357300815fa7324d560697862963fca" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.425863 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e400-account-create-update-fc5xv" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.443406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5sffp" event={"ID":"fc6f4e87-13b0-43ef-8a0c-d94025d5af0a","Type":"ContainerDied","Data":"f6bd2bd22d4f0559cde3af07acb7141a61e0d8e61e0f3239fce863baa0b1a10e"} Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.443447 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6bd2bd22d4f0559cde3af07acb7141a61e0d8e61e0f3239fce863baa0b1a10e" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.443515 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5sffp" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.451585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerStarted","Data":"2fcdbe63b928fbeef4417b8237eb37201ff476197135cdd08ee936f92f047061"} Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.458239 4749 generic.go:334] "Generic (PLEG): container finished" podID="93549c28-1d00-40bc-bddc-3e0d93b913f2" containerID="cd9352357a3a075908a05182180270262d056cc6d3b4e4c60181b6a14c3e7336" exitCode=0 Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.458351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-80e6-account-create-update-zqbb9" event={"ID":"93549c28-1d00-40bc-bddc-3e0d93b913f2","Type":"ContainerDied","Data":"cd9352357a3a075908a05182180270262d056cc6d3b4e4c60181b6a14c3e7336"} Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.464342 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfjm7\" (UniqueName: \"kubernetes.io/projected/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-kube-api-access-pfjm7\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.464368 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.464377 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfkx2\" (UniqueName: \"kubernetes.io/projected/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a-kube-api-access-vfkx2\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.464386 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.464396 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkl96\" (UniqueName: \"kubernetes.io/projected/f46c02d7-38cd-48a6-acd9-dc45c744ce86-kube-api-access-lkl96\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.464406 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mns5g\" (UniqueName: \"kubernetes.io/projected/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef-kube-api-access-mns5g\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.464415 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:36 crc kubenswrapper[4749]: I0128 18:56:36.464423 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46c02d7-38cd-48a6-acd9-dc45c744ce86-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.773037 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.790811 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.801214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jn56l" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898230 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af817e85-9839-4aae-afdd-c764fac277a2-operator-scripts\") pod \"af817e85-9839-4aae-afdd-c764fac277a2\" (UID: \"af817e85-9839-4aae-afdd-c764fac277a2\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898299 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbj2b\" (UniqueName: \"kubernetes.io/projected/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-kube-api-access-nbj2b\") pod \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-sb\") pod \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898477 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t29v\" (UniqueName: \"kubernetes.io/projected/af817e85-9839-4aae-afdd-c764fac277a2-kube-api-access-5t29v\") pod \"af817e85-9839-4aae-afdd-c764fac277a2\" (UID: \"af817e85-9839-4aae-afdd-c764fac277a2\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898590 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f07580-cd44-43b4-a459-69d3984e1c09-operator-scripts\") pod \"a4f07580-cd44-43b4-a459-69d3984e1c09\" (UID: \"a4f07580-cd44-43b4-a459-69d3984e1c09\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwf2r\" (UniqueName: \"kubernetes.io/projected/a4f07580-cd44-43b4-a459-69d3984e1c09-kube-api-access-kwf2r\") pod \"a4f07580-cd44-43b4-a459-69d3984e1c09\" (UID: \"a4f07580-cd44-43b4-a459-69d3984e1c09\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898720 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-dns-svc\") pod \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-config\") pod \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.898822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-nb\") pod \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\" (UID: \"68126ff8-8a74-4130-ba3f-a806dc4a8b2a\") " Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.899973 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f07580-cd44-43b4-a459-69d3984e1c09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4f07580-cd44-43b4-a459-69d3984e1c09" (UID: "a4f07580-cd44-43b4-a459-69d3984e1c09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.900527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af817e85-9839-4aae-afdd-c764fac277a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af817e85-9839-4aae-afdd-c764fac277a2" (UID: "af817e85-9839-4aae-afdd-c764fac277a2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.906414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f07580-cd44-43b4-a459-69d3984e1c09-kube-api-access-kwf2r" (OuterVolumeSpecName: "kube-api-access-kwf2r") pod "a4f07580-cd44-43b4-a459-69d3984e1c09" (UID: "a4f07580-cd44-43b4-a459-69d3984e1c09"). InnerVolumeSpecName "kube-api-access-kwf2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.914205 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-kube-api-access-nbj2b" (OuterVolumeSpecName: "kube-api-access-nbj2b") pod "68126ff8-8a74-4130-ba3f-a806dc4a8b2a" (UID: "68126ff8-8a74-4130-ba3f-a806dc4a8b2a"). InnerVolumeSpecName "kube-api-access-nbj2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.924865 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af817e85-9839-4aae-afdd-c764fac277a2-kube-api-access-5t29v" (OuterVolumeSpecName: "kube-api-access-5t29v") pod "af817e85-9839-4aae-afdd-c764fac277a2" (UID: "af817e85-9839-4aae-afdd-c764fac277a2"). InnerVolumeSpecName "kube-api-access-5t29v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.963854 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68126ff8-8a74-4130-ba3f-a806dc4a8b2a" (UID: "68126ff8-8a74-4130-ba3f-a806dc4a8b2a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.964116 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68126ff8-8a74-4130-ba3f-a806dc4a8b2a" (UID: "68126ff8-8a74-4130-ba3f-a806dc4a8b2a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.992916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-config" (OuterVolumeSpecName: "config") pod "68126ff8-8a74-4130-ba3f-a806dc4a8b2a" (UID: "68126ff8-8a74-4130-ba3f-a806dc4a8b2a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:37 crc kubenswrapper[4749]: I0128 18:56:37.993147 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68126ff8-8a74-4130-ba3f-a806dc4a8b2a" (UID: "68126ff8-8a74-4130-ba3f-a806dc4a8b2a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001039 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001080 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001097 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af817e85-9839-4aae-afdd-c764fac277a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001111 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbj2b\" (UniqueName: \"kubernetes.io/projected/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-kube-api-access-nbj2b\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001126 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001138 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t29v\" (UniqueName: \"kubernetes.io/projected/af817e85-9839-4aae-afdd-c764fac277a2-kube-api-access-5t29v\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001151 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4f07580-cd44-43b4-a459-69d3984e1c09-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001162 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwf2r\" (UniqueName: \"kubernetes.io/projected/a4f07580-cd44-43b4-a459-69d3984e1c09-kube-api-access-kwf2r\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.001173 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68126ff8-8a74-4130-ba3f-a806dc4a8b2a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.005598 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.049943 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vjt8b"] Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.050785 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerName="dnsmasq-dns" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.050804 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerName="dnsmasq-dns" Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.050828 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6f4e87-13b0-43ef-8a0c-d94025d5af0a" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.050837 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6f4e87-13b0-43ef-8a0c-d94025d5af0a" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.050852 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f07580-cd44-43b4-a459-69d3984e1c09" containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.050861 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f07580-cd44-43b4-a459-69d3984e1c09" containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.050873 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af817e85-9839-4aae-afdd-c764fac277a2" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.050880 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="af817e85-9839-4aae-afdd-c764fac277a2" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.050900 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fc78d3-bd48-48df-b646-a7d44ed0bd3b" containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.050909 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fc78d3-bd48-48df-b646-a7d44ed0bd3b" containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.050936 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93549c28-1d00-40bc-bddc-3e0d93b913f2" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.050944 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="93549c28-1d00-40bc-bddc-3e0d93b913f2" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.050962 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46c02d7-38cd-48a6-acd9-dc45c744ce86" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.050970 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46c02d7-38cd-48a6-acd9-dc45c744ce86" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.050982 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef" containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.050992 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef" 
containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: E0128 18:56:38.051007 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerName="init" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051015 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerName="init" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051237 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6f4e87-13b0-43ef-8a0c-d94025d5af0a" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051254 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f07580-cd44-43b4-a459-69d3984e1c09" containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051268 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="93549c28-1d00-40bc-bddc-3e0d93b913f2" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051288 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fc78d3-bd48-48df-b646-a7d44ed0bd3b" containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051306 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46c02d7-38cd-48a6-acd9-dc45c744ce86" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051314 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerName="dnsmasq-dns" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051490 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef" containerName="mariadb-database-create" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.051507 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="af817e85-9839-4aae-afdd-c764fac277a2" containerName="mariadb-account-create-update" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.053925 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.058424 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.059313 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b8x9h" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.073667 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vjt8b"] Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.102613 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-db-sync-config-data\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.102680 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56fr8\" (UniqueName: \"kubernetes.io/projected/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-kube-api-access-56fr8\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.102703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-combined-ca-bundle\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.102910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-config-data\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.155197 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-fc3a-account-create-update-jwst7"] Jan 28 18:56:38 crc kubenswrapper[4749]: W0128 18:56:38.158367 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea422f17_7b59_48ad_8434_557e5c0a6096.slice/crio-ba0ccba53748bcfe0e2ebd497a7ede6dbf8cb6efe3d89e0ff11ad3bb5ab8b7d8 WatchSource:0}: Error finding container ba0ccba53748bcfe0e2ebd497a7ede6dbf8cb6efe3d89e0ff11ad3bb5ab8b7d8: Status 404 returned error can't find the container with id ba0ccba53748bcfe0e2ebd497a7ede6dbf8cb6efe3d89e0ff11ad3bb5ab8b7d8 Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.204471 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4ptx\" (UniqueName: \"kubernetes.io/projected/93549c28-1d00-40bc-bddc-3e0d93b913f2-kube-api-access-l4ptx\") pod \"93549c28-1d00-40bc-bddc-3e0d93b913f2\" (UID: \"93549c28-1d00-40bc-bddc-3e0d93b913f2\") " Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.208226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93549c28-1d00-40bc-bddc-3e0d93b913f2-operator-scripts\") pod \"93549c28-1d00-40bc-bddc-3e0d93b913f2\" (UID: 
\"93549c28-1d00-40bc-bddc-3e0d93b913f2\") " Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.209042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56fr8\" (UniqueName: \"kubernetes.io/projected/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-kube-api-access-56fr8\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.209232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-combined-ca-bundle\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.208710 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93549c28-1d00-40bc-bddc-3e0d93b913f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93549c28-1d00-40bc-bddc-3e0d93b913f2" (UID: "93549c28-1d00-40bc-bddc-3e0d93b913f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.209233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93549c28-1d00-40bc-bddc-3e0d93b913f2-kube-api-access-l4ptx" (OuterVolumeSpecName: "kube-api-access-l4ptx") pod "93549c28-1d00-40bc-bddc-3e0d93b913f2" (UID: "93549c28-1d00-40bc-bddc-3e0d93b913f2"). InnerVolumeSpecName "kube-api-access-l4ptx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.210348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-config-data\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.210568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-db-sync-config-data\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.210763 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4ptx\" (UniqueName: \"kubernetes.io/projected/93549c28-1d00-40bc-bddc-3e0d93b913f2-kube-api-access-l4ptx\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.210833 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93549c28-1d00-40bc-bddc-3e0d93b913f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.213371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-db-sync-config-data\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.213458 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-config-data\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.214242 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-combined-ca-bundle\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.218875 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-znzw7"] Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.224089 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56fr8\" (UniqueName: \"kubernetes.io/projected/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-kube-api-access-56fr8\") pod \"glance-db-sync-vjt8b\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.377874 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vjt8b" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.477236 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" event={"ID":"ea422f17-7b59-48ad-8434-557e5c0a6096","Type":"ContainerStarted","Data":"ba0ccba53748bcfe0e2ebd497a7ede6dbf8cb6efe3d89e0ff11ad3bb5ab8b7d8"} Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.479152 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-80e6-account-create-update-zqbb9" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.479165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-80e6-account-create-update-zqbb9" event={"ID":"93549c28-1d00-40bc-bddc-3e0d93b913f2","Type":"ContainerDied","Data":"cd66ae25490012941c0abb330f742a01de28a0830b399b4d105c6a9d772e920c"} Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.479253 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd66ae25490012941c0abb330f742a01de28a0830b399b4d105c6a9d772e920c" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.487754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" event={"ID":"68126ff8-8a74-4130-ba3f-a806dc4a8b2a","Type":"ContainerDied","Data":"fb152edf0a955dac2809add73f6b8c21c68d0839db10eaaf17e12eeaf0aa38e4"} Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.488000 4749 scope.go:117] "RemoveContainer" containerID="78648d519e6211ce1a563e570369fdd79e89a5cffbb3dd087d5a84fc0c978a12" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.488228 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.491942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a33-account-create-update-wx8kt" event={"ID":"af817e85-9839-4aae-afdd-c764fac277a2","Type":"ContainerDied","Data":"da82686f6ba7ab747f2ad95fcbba546e6637301af6b374497baddcbf86721436"} Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.491986 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da82686f6ba7ab747f2ad95fcbba546e6637301af6b374497baddcbf86721436" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.492072 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a33-account-create-update-wx8kt" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.495865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jn56l" event={"ID":"a4f07580-cd44-43b4-a459-69d3984e1c09","Type":"ContainerDied","Data":"6306315ed1f1d1c5697ca3d4dc69e9b6e565059656cc3989f04c57f3acfee2f0"} Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.495999 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6306315ed1f1d1c5697ca3d4dc69e9b6e565059656cc3989f04c57f3acfee2f0" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.496151 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jn56l" Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.536778 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s9fdf"] Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.545647 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s9fdf"] Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.886886 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" path="/var/lib/kubelet/pods/68126ff8-8a74-4130-ba3f-a806dc4a8b2a/volumes" Jan 28 18:56:38 crc kubenswrapper[4749]: W0128 18:56:38.939253 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod094de6d7_0ad5_4985_9894_b004751d377b.slice/crio-70735cfb7f505b3fef5455251871e693af9ac37a11b6bc1d6a4b7538b00e1bef WatchSource:0}: Error finding container 70735cfb7f505b3fef5455251871e693af9ac37a11b6bc1d6a4b7538b00e1bef: Status 404 returned error can't find the container with id 70735cfb7f505b3fef5455251871e693af9ac37a11b6bc1d6a4b7538b00e1bef Jan 28 18:56:38 crc kubenswrapper[4749]: I0128 18:56:38.954066 4749 scope.go:117] "RemoveContainer" containerID="6bb457f7c110f4dff20e8b76a24c3c9c1682944c7d1e1c2e6539e761b121cde2" Jan 28 18:56:39 crc kubenswrapper[4749]: I0128 18:56:39.507963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" event={"ID":"094de6d7-0ad5-4985-9894-b004751d377b","Type":"ContainerStarted","Data":"70735cfb7f505b3fef5455251871e693af9ac37a11b6bc1d6a4b7538b00e1bef"} Jan 28 18:56:39 crc kubenswrapper[4749]: W0128 18:56:39.579036 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc395eb86_c6bf_4d4b_b0dd_b19cc34c004f.slice/crio-3ad85c2020bf280c3c3c6507b0fd47b562ef153f9985f6dc13736d8a09df8ba4 WatchSource:0}: Error finding container 
3ad85c2020bf280c3c3c6507b0fd47b562ef153f9985f6dc13736d8a09df8ba4: Status 404 returned error can't find the container with id 3ad85c2020bf280c3c3c6507b0fd47b562ef153f9985f6dc13736d8a09df8ba4 Jan 28 18:56:39 crc kubenswrapper[4749]: I0128 18:56:39.583402 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vjt8b"] Jan 28 18:56:40 crc kubenswrapper[4749]: I0128 18:56:40.526545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mhwkd" event={"ID":"26e02b25-1356-40b3-b33a-947082d120e0","Type":"ContainerStarted","Data":"3bd3f1e2fd4a373e7db423ac4861e6efd382d2225d94229e11ad293aab0ee52a"} Jan 28 18:56:40 crc kubenswrapper[4749]: I0128 18:56:40.528091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjt8b" event={"ID":"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f","Type":"ContainerStarted","Data":"3ad85c2020bf280c3c3c6507b0fd47b562ef153f9985f6dc13736d8a09df8ba4"} Jan 28 18:56:40 crc kubenswrapper[4749]: I0128 18:56:40.529946 4749 generic.go:334] "Generic (PLEG): container finished" podID="094de6d7-0ad5-4985-9894-b004751d377b" containerID="e5331745763b3092bbc7057fbfedf8b2068079d2719742af5e58e3b422448336" exitCode=0 Jan 28 18:56:40 crc kubenswrapper[4749]: I0128 18:56:40.530007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" event={"ID":"094de6d7-0ad5-4985-9894-b004751d377b","Type":"ContainerDied","Data":"e5331745763b3092bbc7057fbfedf8b2068079d2719742af5e58e3b422448336"} Jan 28 18:56:40 crc kubenswrapper[4749]: I0128 18:56:40.532175 4749 generic.go:334] "Generic (PLEG): container finished" podID="ea422f17-7b59-48ad-8434-557e5c0a6096" containerID="5016d10624d9a3bea4c1f4a467059ec96f1445afa212377a3287ee0343464475" exitCode=0 Jan 28 18:56:40 crc kubenswrapper[4749]: I0128 18:56:40.532207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" event={"ID":"ea422f17-7b59-48ad-8434-557e5c0a6096","Type":"ContainerDied","Data":"5016d10624d9a3bea4c1f4a467059ec96f1445afa212377a3287ee0343464475"} Jan 28 18:56:40 crc kubenswrapper[4749]: I0128 18:56:40.555453 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mhwkd" podStartSLOduration=5.366025906 podStartE2EDuration="11.555424027s" podCreationTimestamp="2026-01-28 18:56:29 +0000 UTC" firstStartedPulling="2026-01-28 18:56:33.48113192 +0000 UTC m=+1261.492658695" lastFinishedPulling="2026-01-28 18:56:39.670530031 +0000 UTC m=+1267.682056816" observedRunningTime="2026-01-28 18:56:40.542868396 +0000 UTC m=+1268.554395191" watchObservedRunningTime="2026-01-28 18:56:40.555424027 +0000 UTC m=+1268.566950802" Jan 28 18:56:41 crc kubenswrapper[4749]: I0128 18:56:41.182797 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5sffp"] Jan 28 18:56:41 crc kubenswrapper[4749]: I0128 18:56:41.192113 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5sffp"] Jan 28 18:56:41 crc kubenswrapper[4749]: I0128 18:56:41.693964 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 28 18:56:41 crc kubenswrapper[4749]: I0128 18:56:41.711807 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-s9fdf" podUID="68126ff8-8a74-4130-ba3f-a806dc4a8b2a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.145:5353: i/o timeout" Jan 28 18:56:41 crc kubenswrapper[4749]: I0128 18:56:41.786319 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:41 crc kubenswrapper[4749]: E0128 18:56:41.786725 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 18:56:41 crc kubenswrapper[4749]: E0128 18:56:41.786756 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 18:56:41 crc kubenswrapper[4749]: E0128 18:56:41.786810 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift podName:a1992d04-5d9f-498f-bee7-f2ab001feb76 nodeName:}" failed. No retries permitted until 2026-01-28 18:56:57.786790621 +0000 UTC m=+1285.798317396 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift") pod "swift-storage-0" (UID: "a1992d04-5d9f-498f-bee7-f2ab001feb76") : configmap "swift-ring-files" not found Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.105518 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.115490 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.299784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5gbx\" (UniqueName: \"kubernetes.io/projected/ea422f17-7b59-48ad-8434-557e5c0a6096-kube-api-access-j5gbx\") pod \"ea422f17-7b59-48ad-8434-557e5c0a6096\" (UID: \"ea422f17-7b59-48ad-8434-557e5c0a6096\") " Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.300148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea422f17-7b59-48ad-8434-557e5c0a6096-operator-scripts\") pod \"ea422f17-7b59-48ad-8434-557e5c0a6096\" (UID: \"ea422f17-7b59-48ad-8434-557e5c0a6096\") " Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.300193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5m2c\" (UniqueName: \"kubernetes.io/projected/094de6d7-0ad5-4985-9894-b004751d377b-kube-api-access-q5m2c\") pod \"094de6d7-0ad5-4985-9894-b004751d377b\" (UID: \"094de6d7-0ad5-4985-9894-b004751d377b\") " Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.300381 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/094de6d7-0ad5-4985-9894-b004751d377b-operator-scripts\") pod \"094de6d7-0ad5-4985-9894-b004751d377b\" (UID: \"094de6d7-0ad5-4985-9894-b004751d377b\") " Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.300815 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/094de6d7-0ad5-4985-9894-b004751d377b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"094de6d7-0ad5-4985-9894-b004751d377b" (UID: "094de6d7-0ad5-4985-9894-b004751d377b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.300904 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea422f17-7b59-48ad-8434-557e5c0a6096-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea422f17-7b59-48ad-8434-557e5c0a6096" (UID: "ea422f17-7b59-48ad-8434-557e5c0a6096"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.301445 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea422f17-7b59-48ad-8434-557e5c0a6096-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.301470 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/094de6d7-0ad5-4985-9894-b004751d377b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.305685 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/094de6d7-0ad5-4985-9894-b004751d377b-kube-api-access-q5m2c" (OuterVolumeSpecName: "kube-api-access-q5m2c") pod "094de6d7-0ad5-4985-9894-b004751d377b" (UID: "094de6d7-0ad5-4985-9894-b004751d377b"). InnerVolumeSpecName "kube-api-access-q5m2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.306095 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea422f17-7b59-48ad-8434-557e5c0a6096-kube-api-access-j5gbx" (OuterVolumeSpecName: "kube-api-access-j5gbx") pod "ea422f17-7b59-48ad-8434-557e5c0a6096" (UID: "ea422f17-7b59-48ad-8434-557e5c0a6096"). InnerVolumeSpecName "kube-api-access-j5gbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.402855 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5gbx\" (UniqueName: \"kubernetes.io/projected/ea422f17-7b59-48ad-8434-557e5c0a6096-kube-api-access-j5gbx\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.402889 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5m2c\" (UniqueName: \"kubernetes.io/projected/094de6d7-0ad5-4985-9894-b004751d377b-kube-api-access-q5m2c\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.555060 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerStarted","Data":"c9ba22418c9d94b102012de76724710f1f8d52f44fb7f214922903eddefa5a91"} Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.557122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" event={"ID":"094de6d7-0ad5-4985-9894-b004751d377b","Type":"ContainerDied","Data":"70735cfb7f505b3fef5455251871e693af9ac37a11b6bc1d6a4b7538b00e1bef"} Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.557181 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70735cfb7f505b3fef5455251871e693af9ac37a11b6bc1d6a4b7538b00e1bef" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.557271 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-znzw7" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.567467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" event={"ID":"ea422f17-7b59-48ad-8434-557e5c0a6096","Type":"ContainerDied","Data":"ba0ccba53748bcfe0e2ebd497a7ede6dbf8cb6efe3d89e0ff11ad3bb5ab8b7d8"} Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.567514 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0ccba53748bcfe0e2ebd497a7ede6dbf8cb6efe3d89e0ff11ad3bb5ab8b7d8" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.567517 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-fc3a-account-create-update-jwst7" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.589854 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.137548424 podStartE2EDuration="58.589830819s" podCreationTimestamp="2026-01-28 18:55:44 +0000 UTC" firstStartedPulling="2026-01-28 18:55:59.451082347 +0000 UTC m=+1227.462609122" lastFinishedPulling="2026-01-28 18:56:41.903364742 +0000 UTC m=+1269.914891517" observedRunningTime="2026-01-28 18:56:42.580580909 +0000 UTC m=+1270.592107704" watchObservedRunningTime="2026-01-28 18:56:42.589830819 +0000 UTC m=+1270.601357594" Jan 28 18:56:42 crc kubenswrapper[4749]: I0128 18:56:42.904737 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc6f4e87-13b0-43ef-8a0c-d94025d5af0a" path="/var/lib/kubelet/pods/fc6f4e87-13b0-43ef-8a0c-d94025d5af0a/volumes" Jan 28 18:56:44 crc kubenswrapper[4749]: I0128 18:56:44.587066 4749 generic.go:334] "Generic (PLEG): container finished" podID="19a12543-c0a3-486f-b5bd-4f2862c15a37" containerID="e1aa21e768bf671ed2327d9375f72fe33894fdcd7c739d2a4ad6e80fa40db665" exitCode=0 Jan 28 18:56:44 crc kubenswrapper[4749]: I0128 18:56:44.587145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a12543-c0a3-486f-b5bd-4f2862c15a37","Type":"ContainerDied","Data":"e1aa21e768bf671ed2327d9375f72fe33894fdcd7c739d2a4ad6e80fa40db665"} Jan 28 18:56:44 crc kubenswrapper[4749]: I0128 18:56:44.591054 4749 generic.go:334] "Generic (PLEG): container finished" podID="5954ab85-e42a-498a-ae91-fd46445c0860" containerID="141990bcc151dcca3b6ff439514702fe0b63c85be8fb7fd086262326aeaa5229" exitCode=0 Jan 28 18:56:44 crc kubenswrapper[4749]: I0128 18:56:44.591095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5954ab85-e42a-498a-ae91-fd46445c0860","Type":"ContainerDied","Data":"141990bcc151dcca3b6ff439514702fe0b63c85be8fb7fd086262326aeaa5229"} Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.131761 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 18:56:45 crc kubenswrapper[4749]: E0128 18:56:45.132343 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094de6d7-0ad5-4985-9894-b004751d377b" containerName="mariadb-database-create" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.132365 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="094de6d7-0ad5-4985-9894-b004751d377b" containerName="mariadb-database-create" Jan 28 18:56:45 crc kubenswrapper[4749]: E0128 18:56:45.132408 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea422f17-7b59-48ad-8434-557e5c0a6096" containerName="mariadb-account-create-update" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.132417 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea422f17-7b59-48ad-8434-557e5c0a6096" containerName="mariadb-account-create-update" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.132631 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea422f17-7b59-48ad-8434-557e5c0a6096" containerName="mariadb-account-create-update" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.132678 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="094de6d7-0ad5-4985-9894-b004751d377b" containerName="mariadb-database-create" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 
18:56:45.135205 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.138673 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.144718 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.277647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-config-data\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.277863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.278030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvf56\" (UniqueName: \"kubernetes.io/projected/63120686-40c3-4585-9e76-26484d36c17b-kube-api-access-fvf56\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.381720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-config-data\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.381854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.381931 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvf56\" (UniqueName: \"kubernetes.io/projected/63120686-40c3-4585-9e76-26484d36c17b-kube-api-access-fvf56\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.388216 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.389043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-config-data\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.405034 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fvf56\" (UniqueName: \"kubernetes.io/projected/63120686-40c3-4585-9e76-26484d36c17b-kube-api-access-fvf56\") pod \"mysqld-exporter-0\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.468925 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.607940 4749 generic.go:334] "Generic (PLEG): container finished" podID="94850daa-65af-4e6a-ad29-cfa28c3076e7" containerID="8ade27226a2163e9471db289c245657a1eeccd0ff5642100b144b2252989fa2d" exitCode=0 Jan 28 18:56:45 crc kubenswrapper[4749]: I0128 18:56:45.607992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"94850daa-65af-4e6a-ad29-cfa28c3076e7","Type":"ContainerDied","Data":"8ade27226a2163e9471db289c245657a1eeccd0ff5642100b144b2252989fa2d"} Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.226911 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.226985 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.229687 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.272835 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xkbsr"] Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.274637 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.276756 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.301729 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xkbsr"] Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.400967 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-operator-scripts\") pod \"root-account-create-update-xkbsr\" (UID: \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\") " pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.401064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccj4v\" (UniqueName: \"kubernetes.io/projected/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-kube-api-access-ccj4v\") pod \"root-account-create-update-xkbsr\" (UID: \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\") " pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.504266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-operator-scripts\") pod \"root-account-create-update-xkbsr\" (UID: \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\") " pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.504356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccj4v\" (UniqueName: \"kubernetes.io/projected/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-kube-api-access-ccj4v\") pod \"root-account-create-update-xkbsr\" (UID: \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\") " pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.505116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-operator-scripts\") pod \"root-account-create-update-xkbsr\" (UID: \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\") " pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.527128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccj4v\" (UniqueName: \"kubernetes.io/projected/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-kube-api-access-ccj4v\") pod \"root-account-create-update-xkbsr\" (UID: \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\") " pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.596373 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:46 crc kubenswrapper[4749]: I0128 18:56:46.621010 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:47 crc kubenswrapper[4749]: I0128 18:56:47.629203 4749 generic.go:334] "Generic (PLEG): container finished" podID="26e02b25-1356-40b3-b33a-947082d120e0" containerID="3bd3f1e2fd4a373e7db423ac4861e6efd382d2225d94229e11ad293aab0ee52a" exitCode=0 Jan 28 18:56:47 crc kubenswrapper[4749]: I0128 18:56:47.629590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mhwkd" event={"ID":"26e02b25-1356-40b3-b33a-947082d120e0","Type":"ContainerDied","Data":"3bd3f1e2fd4a373e7db423ac4861e6efd382d2225d94229e11ad293aab0ee52a"} Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.355009 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-g8gw7" podUID="ca89079d-f8be-4e31-a78d-1c4257260a8f" containerName="ovn-controller" probeResult="failure" output=< Jan 28 18:56:49 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 28 18:56:49 crc kubenswrapper[4749]: > Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.368913 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.394463 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7z7wf" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.527421 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.527961 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="prometheus" containerID="cri-o://67106dc4864cfe613cad5fbd43a0e263e8a5d0985918253bff078189ad45e2e4" gracePeriod=600 Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.528075 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="thanos-sidecar" containerID="cri-o://c9ba22418c9d94b102012de76724710f1f8d52f44fb7f214922903eddefa5a91" gracePeriod=600 Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.528102 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="config-reloader" containerID="cri-o://2fcdbe63b928fbeef4417b8237eb37201ff476197135cdd08ee936f92f047061" gracePeriod=600 Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.600082 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-g8gw7-config-gf92b"] Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.601872 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.606923 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.609837 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g8gw7-config-gf92b"] Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.687912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.687977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-scripts\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.688378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgxf9\" (UniqueName: \"kubernetes.io/projected/ff54120f-d967-42cf-989f-1136ab0d5746-kube-api-access-hgxf9\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.688485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-log-ovn\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.688573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run-ovn\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.688795 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-additional-scripts\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-scripts\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791251 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgxf9\" (UniqueName: 
\"kubernetes.io/projected/ff54120f-d967-42cf-989f-1136ab0d5746-kube-api-access-hgxf9\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-log-ovn\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791319 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run-ovn\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-additional-scripts\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-log-ovn\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run-ovn\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.791741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.792287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-additional-scripts\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.795019 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-scripts\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.825099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgxf9\" (UniqueName: \"kubernetes.io/projected/ff54120f-d967-42cf-989f-1136ab0d5746-kube-api-access-hgxf9\") pod \"ovn-controller-g8gw7-config-gf92b\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:49 crc kubenswrapper[4749]: I0128 18:56:49.930185 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:50 crc kubenswrapper[4749]: I0128 18:56:50.668441 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerID="c9ba22418c9d94b102012de76724710f1f8d52f44fb7f214922903eddefa5a91" exitCode=0 Jan 28 18:56:50 crc kubenswrapper[4749]: I0128 18:56:50.668771 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerID="2fcdbe63b928fbeef4417b8237eb37201ff476197135cdd08ee936f92f047061" exitCode=0 Jan 28 18:56:50 crc kubenswrapper[4749]: I0128 18:56:50.668784 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerID="67106dc4864cfe613cad5fbd43a0e263e8a5d0985918253bff078189ad45e2e4" exitCode=0 Jan 28 18:56:50 crc kubenswrapper[4749]: I0128 18:56:50.668515 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerDied","Data":"c9ba22418c9d94b102012de76724710f1f8d52f44fb7f214922903eddefa5a91"} Jan 28 18:56:50 crc kubenswrapper[4749]: I0128 18:56:50.668851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerDied","Data":"2fcdbe63b928fbeef4417b8237eb37201ff476197135cdd08ee936f92f047061"} Jan 28 18:56:50 crc kubenswrapper[4749]: I0128 18:56:50.668865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerDied","Data":"67106dc4864cfe613cad5fbd43a0e263e8a5d0985918253bff078189ad45e2e4"} Jan 28 18:56:50 crc kubenswrapper[4749]: I0128 18:56:50.670546 4749 generic.go:334] "Generic (PLEG): container finished" podID="772600b2-9086-4d72-bb86-6edfb0a21b35" containerID="0bf6c4cd26045235299e00809673399a4195061668568b9db557cc64814ea108" exitCode=0 Jan 28 18:56:50 crc kubenswrapper[4749]: I0128 18:56:50.670570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"772600b2-9086-4d72-bb86-6edfb0a21b35","Type":"ContainerDied","Data":"0bf6c4cd26045235299e00809673399a4195061668568b9db557cc64814ea108"} Jan 28 18:56:51 crc kubenswrapper[4749]: I0128 18:56:51.228836 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.138:9090/-/ready\": dial tcp 10.217.0.138:9090: connect: connection refused" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.179924 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.243046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-swiftconf\") pod \"26e02b25-1356-40b3-b33a-947082d120e0\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.243430 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-combined-ca-bundle\") pod \"26e02b25-1356-40b3-b33a-947082d120e0\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.243469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-scripts\") pod \"26e02b25-1356-40b3-b33a-947082d120e0\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.243488 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42chm\" (UniqueName: \"kubernetes.io/projected/26e02b25-1356-40b3-b33a-947082d120e0-kube-api-access-42chm\") pod \"26e02b25-1356-40b3-b33a-947082d120e0\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.243612 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26e02b25-1356-40b3-b33a-947082d120e0-etc-swift\") pod \"26e02b25-1356-40b3-b33a-947082d120e0\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.244236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-dispersionconf\") pod \"26e02b25-1356-40b3-b33a-947082d120e0\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.244428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-ring-data-devices\") pod \"26e02b25-1356-40b3-b33a-947082d120e0\" (UID: \"26e02b25-1356-40b3-b33a-947082d120e0\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.244516 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e02b25-1356-40b3-b33a-947082d120e0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "26e02b25-1356-40b3-b33a-947082d120e0" (UID: "26e02b25-1356-40b3-b33a-947082d120e0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.244947 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26e02b25-1356-40b3-b33a-947082d120e0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.245463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "26e02b25-1356-40b3-b33a-947082d120e0" (UID: "26e02b25-1356-40b3-b33a-947082d120e0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.250934 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e02b25-1356-40b3-b33a-947082d120e0-kube-api-access-42chm" (OuterVolumeSpecName: "kube-api-access-42chm") pod "26e02b25-1356-40b3-b33a-947082d120e0" (UID: "26e02b25-1356-40b3-b33a-947082d120e0"). InnerVolumeSpecName "kube-api-access-42chm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.258479 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "26e02b25-1356-40b3-b33a-947082d120e0" (UID: "26e02b25-1356-40b3-b33a-947082d120e0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.285210 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "26e02b25-1356-40b3-b33a-947082d120e0" (UID: "26e02b25-1356-40b3-b33a-947082d120e0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.287487 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26e02b25-1356-40b3-b33a-947082d120e0" (UID: "26e02b25-1356-40b3-b33a-947082d120e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.304300 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-scripts" (OuterVolumeSpecName: "scripts") pod "26e02b25-1356-40b3-b33a-947082d120e0" (UID: "26e02b25-1356-40b3-b33a-947082d120e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.346282 4749 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.346571 4749 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.346584 4749 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.346608 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e02b25-1356-40b3-b33a-947082d120e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.346618 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26e02b25-1356-40b3-b33a-947082d120e0-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.346626 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42chm\" (UniqueName: \"kubernetes.io/projected/26e02b25-1356-40b3-b33a-947082d120e0-kube-api-access-42chm\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.454268 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.551620 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4dbx\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-kube-api-access-c4dbx\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.551679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-thanos-prometheus-http-client-file\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.551732 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-1\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.551784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-2\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.551807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-web-config\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.551899 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.551953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config-out\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.552117 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.552177 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-0\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.552236 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-tls-assets\") pod \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\" (UID: \"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b\") " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.553979 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.560654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.561065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.562601 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config-out" (OuterVolumeSpecName: "config-out") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.562796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.564570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-kube-api-access-c4dbx" (OuterVolumeSpecName: "kube-api-access-c4dbx") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "kube-api-access-c4dbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.568909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config" (OuterVolumeSpecName: "config") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.574520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.597739 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.616501 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-web-config" (OuterVolumeSpecName: "web-config") pod "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" (UID: "5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658128 4749 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config-out\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658229 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") on node \"crc\" " Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658246 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658259 4749 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658273 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4dbx\" (UniqueName: \"kubernetes.io/projected/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-kube-api-access-c4dbx\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658289 4749 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658302 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc 
kubenswrapper[4749]: I0128 18:56:52.658317 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658350 4749 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-web-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.658364 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.694059 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.695371 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c") on node "crc" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.703544 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.710593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5954ab85-e42a-498a-ae91-fd46445c0860","Type":"ContainerStarted","Data":"533bd76410e74d82ae3930801664b79143e8b5faf9046804d32adb32ee293902"} Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.711435 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.718908 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xkbsr"] Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.725536 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"772600b2-9086-4d72-bb86-6edfb0a21b35","Type":"ContainerStarted","Data":"c079b2903b80eb6f706e8ef61fecb04fc9379663f8e3193e4b596721c67da74b"} Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.726203 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.734241 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=64.174994524 podStartE2EDuration="1m14.734222331s" podCreationTimestamp="2026-01-28 18:55:38 +0000 UTC" firstStartedPulling="2026-01-28 18:55:58.880570784 +0000 UTC m=+1226.892097569" lastFinishedPulling="2026-01-28 18:56:09.439798601 +0000 UTC m=+1237.451325376" observedRunningTime="2026-01-28 18:56:52.732413656 +0000 UTC m=+1280.743940461" watchObservedRunningTime="2026-01-28 18:56:52.734222331 +0000 UTC m=+1280.745749106" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.740121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"94850daa-65af-4e6a-ad29-cfa28c3076e7","Type":"ContainerStarted","Data":"40cee00a63d88665c5a9948db85e424ef757d3987e1e61d3f2092f917f5f58c2"} Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 
18:56:52.741166 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.765051 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.766106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b","Type":"ContainerDied","Data":"23f715f017cd27f19d95dadab0bcb6898bf9d9776105967a2cbce19d59e09467"} Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.766468 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.769495 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=64.125736034 podStartE2EDuration="1m14.769478635s" podCreationTimestamp="2026-01-28 18:55:38 +0000 UTC" firstStartedPulling="2026-01-28 18:55:58.886707297 +0000 UTC m=+1226.898234072" lastFinishedPulling="2026-01-28 18:56:09.530449898 +0000 UTC m=+1237.541976673" observedRunningTime="2026-01-28 18:56:52.768476081 +0000 UTC m=+1280.780002866" watchObservedRunningTime="2026-01-28 18:56:52.769478635 +0000 UTC m=+1280.781005410" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.770071 4749 scope.go:117] "RemoveContainer" containerID="c9ba22418c9d94b102012de76724710f1f8d52f44fb7f214922903eddefa5a91" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.779181 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a12543-c0a3-486f-b5bd-4f2862c15a37","Type":"ContainerStarted","Data":"9bd14f84cc33eaf72f003dd29b098684c18510c47bb7c8a7b5e454c6597b4454"} Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.780315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.789576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mhwkd" event={"ID":"26e02b25-1356-40b3-b33a-947082d120e0","Type":"ContainerDied","Data":"5ea06e475210f467037ea1e36b441637b13b5adb3a4d32f7fa8cc59d75c57b7f"} Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.789615 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ea06e475210f467037ea1e36b441637b13b5adb3a4d32f7fa8cc59d75c57b7f" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.789674 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mhwkd" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.810930 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=64.078676997 podStartE2EDuration="1m14.810911912s" podCreationTimestamp="2026-01-28 18:55:38 +0000 UTC" firstStartedPulling="2026-01-28 18:55:58.868523266 +0000 UTC m=+1226.880050041" lastFinishedPulling="2026-01-28 18:56:09.600758181 +0000 UTC m=+1237.612284956" observedRunningTime="2026-01-28 18:56:52.810069101 +0000 UTC m=+1280.821595886" watchObservedRunningTime="2026-01-28 18:56:52.810911912 +0000 UTC m=+1280.822438687" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.850844 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-g8gw7-config-gf92b"] Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.864859 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=64.484695212 podStartE2EDuration="1m14.864786678s" podCreationTimestamp="2026-01-28 18:55:38 +0000 UTC" firstStartedPulling="2026-01-28 18:55:58.891244189 +0000 UTC m=+1226.902770964" lastFinishedPulling="2026-01-28 18:56:09.271335655 +0000 UTC m=+1237.282862430" observedRunningTime="2026-01-28 18:56:52.857530968 +0000 UTC m=+1280.869057763" watchObservedRunningTime="2026-01-28 18:56:52.864786678 +0000 UTC m=+1280.876313453" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.869570 4749 scope.go:117] "RemoveContainer" containerID="2fcdbe63b928fbeef4417b8237eb37201ff476197135cdd08ee936f92f047061" Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.934357 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.934390 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:56:52 crc kubenswrapper[4749]: I0128 18:56:52.934954 4749 scope.go:117] "RemoveContainer" containerID="67106dc4864cfe613cad5fbd43a0e263e8a5d0985918253bff078189ad45e2e4" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.015048 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.030299 4749 scope.go:117] "RemoveContainer" containerID="a78f9832d685bb0fcd3bdf9173db5cb564cf3359b95c2bea5f2b936a26aee910" Jan 28 18:56:53 crc kubenswrapper[4749]: E0128 18:56:53.053961 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="thanos-sidecar" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.054383 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="thanos-sidecar" Jan 28 18:56:53 crc kubenswrapper[4749]: E0128 18:56:53.055273 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="init-config-reloader" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.056001 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="init-config-reloader" Jan 28 18:56:53 crc kubenswrapper[4749]: E0128 18:56:53.056644 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e02b25-1356-40b3-b33a-947082d120e0" containerName="swift-ring-rebalance" Jan 28 18:56:53 crc kubenswrapper[4749]: 
I0128 18:56:53.056664 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e02b25-1356-40b3-b33a-947082d120e0" containerName="swift-ring-rebalance" Jan 28 18:56:53 crc kubenswrapper[4749]: E0128 18:56:53.057622 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="config-reloader" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.057642 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="config-reloader" Jan 28 18:56:53 crc kubenswrapper[4749]: E0128 18:56:53.057956 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="prometheus" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.058110 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="prometheus" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.067074 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="config-reloader" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.067304 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="thanos-sidecar" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.067437 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" containerName="prometheus" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.067664 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e02b25-1356-40b3-b33a-947082d120e0" containerName="swift-ring-rebalance" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.095205 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.095549 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.111717 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.111858 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.115056 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.115660 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.115784 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l2j2k" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.116075 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.116236 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.121540 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.126503 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.210768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-config\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.210824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.210854 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.210880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " 
pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.210912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.211169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.211319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.211363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.211386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.211426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.211537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txb6p\" (UniqueName: \"kubernetes.io/projected/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-kube-api-access-txb6p\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.211563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: 
\"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.211640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313659 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313714 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313753 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txb6p\" (UniqueName: \"kubernetes.io/projected/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-kube-api-access-txb6p\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313899 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.313920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-config\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.317799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.319024 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.319063 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/744b74a6be0495f7b599fd99593ce019a865164c2389f972fc00046589d495f0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.320466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.321609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-config\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.322532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.322968 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.323979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.327252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.335604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.344654 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.345027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txb6p\" (UniqueName: \"kubernetes.io/projected/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-kube-api-access-txb6p\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.345065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.353878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.409298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7df56fc7-e7af-4401-bce5-dbdb4c2ac53c\") pod \"prometheus-metric-storage-0\" (UID: \"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3\") " pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.502873 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.799245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xkbsr" event={"ID":"41d52d0a-0edd-4a1f-a19b-674518ed9e6e","Type":"ContainerStarted","Data":"455a9d44004cda7cea0bcfd4442740bfc7de6f54ab80713f3dc85ff239824e7b"} Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.799311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xkbsr" event={"ID":"41d52d0a-0edd-4a1f-a19b-674518ed9e6e","Type":"ContainerStarted","Data":"d00b4e23961e6ae81e1ed3311deae9ac7a9baf51a71b51673b834838207c4c85"} Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.801238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjt8b" event={"ID":"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f","Type":"ContainerStarted","Data":"b4df288d0fcf726ae5713c462e9a8a45c0baddbe762e49efc7dcc6880758d949"} Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.802433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8gw7-config-gf92b" event={"ID":"ff54120f-d967-42cf-989f-1136ab0d5746","Type":"ContainerStarted","Data":"7c4a90c24f22becb2583f63f54511522228301c51a85dfe6803b941024182a2a"} Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.805622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"63120686-40c3-4585-9e76-26484d36c17b","Type":"ContainerStarted","Data":"79f5f265d51673f545f42fe89f9bdd0e5e301dc2c9acfe0be3c9322e40c06904"} Jan 28 18:56:53 crc kubenswrapper[4749]: I0128 18:56:53.834733 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-xkbsr" podStartSLOduration=7.834716222 podStartE2EDuration="7.834716222s" podCreationTimestamp="2026-01-28 18:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:56:53.827182505 +0000 UTC m=+1281.838709290" watchObservedRunningTime="2026-01-28 18:56:53.834716222 +0000 UTC m=+1281.846242997" Jan 28 18:56:54 crc kubenswrapper[4749]: I0128 18:56:54.262756 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-g8gw7" podUID="ca89079d-f8be-4e31-a78d-1c4257260a8f" containerName="ovn-controller" probeResult="failure" output=< Jan 28 18:56:54 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 28 18:56:54 crc kubenswrapper[4749]: > Jan 28 18:56:54 crc kubenswrapper[4749]: I0128 18:56:54.824790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8gw7-config-gf92b" event={"ID":"ff54120f-d967-42cf-989f-1136ab0d5746","Type":"ContainerStarted","Data":"844b79c7c29a31700ab2caeb7e1f8d9265a8d8980a97a1502c8a99f6594c1965"} Jan 28 18:56:54 crc kubenswrapper[4749]: I0128 18:56:54.848584 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vjt8b" podStartSLOduration=4.190373356 podStartE2EDuration="16.848559594s" podCreationTimestamp="2026-01-28 18:56:38 +0000 UTC" firstStartedPulling="2026-01-28 18:56:39.581128515 +0000 UTC m=+1267.592655290" lastFinishedPulling="2026-01-28 18:56:52.239314753 +0000 UTC m=+1280.250841528" observedRunningTime="2026-01-28 18:56:53.857791104 +0000 UTC m=+1281.869317899" watchObservedRunningTime="2026-01-28 
18:56:54.848559594 +0000 UTC m=+1282.860086369" Jan 28 18:56:54 crc kubenswrapper[4749]: I0128 18:56:54.866018 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-g8gw7-config-gf92b" podStartSLOduration=5.865993287 podStartE2EDuration="5.865993287s" podCreationTimestamp="2026-01-28 18:56:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:56:54.841061138 +0000 UTC m=+1282.852587923" watchObservedRunningTime="2026-01-28 18:56:54.865993287 +0000 UTC m=+1282.877520062" Jan 28 18:56:54 crc kubenswrapper[4749]: I0128 18:56:54.901656 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b" path="/var/lib/kubelet/pods/5ce4d6a9-0e4d-4da3-903d-6d29e70d6b0b/volumes" Jan 28 18:56:55 crc kubenswrapper[4749]: I0128 18:56:55.440049 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 18:56:55 crc kubenswrapper[4749]: W0128 18:56:55.715623 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b79ef2_2b8d_486b_a757_a1a5b7dc8ee3.slice/crio-7ea05360d0f9ff95b20bfc407e0723a12c5062773e3eaa8de8665a6a8a05ea9b WatchSource:0}: Error finding container 7ea05360d0f9ff95b20bfc407e0723a12c5062773e3eaa8de8665a6a8a05ea9b: Status 404 returned error can't find the container with id 7ea05360d0f9ff95b20bfc407e0723a12c5062773e3eaa8de8665a6a8a05ea9b Jan 28 18:56:55 crc kubenswrapper[4749]: I0128 18:56:55.879198 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff54120f-d967-42cf-989f-1136ab0d5746" containerID="844b79c7c29a31700ab2caeb7e1f8d9265a8d8980a97a1502c8a99f6594c1965" exitCode=0 Jan 28 18:56:55 crc kubenswrapper[4749]: I0128 18:56:55.879777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8gw7-config-gf92b" event={"ID":"ff54120f-d967-42cf-989f-1136ab0d5746","Type":"ContainerDied","Data":"844b79c7c29a31700ab2caeb7e1f8d9265a8d8980a97a1502c8a99f6594c1965"} Jan 28 18:56:55 crc kubenswrapper[4749]: I0128 18:56:55.886762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3","Type":"ContainerStarted","Data":"7ea05360d0f9ff95b20bfc407e0723a12c5062773e3eaa8de8665a6a8a05ea9b"} Jan 28 18:56:55 crc kubenswrapper[4749]: I0128 18:56:55.898143 4749 generic.go:334] "Generic (PLEG): container finished" podID="41d52d0a-0edd-4a1f-a19b-674518ed9e6e" containerID="455a9d44004cda7cea0bcfd4442740bfc7de6f54ab80713f3dc85ff239824e7b" exitCode=0 Jan 28 18:56:55 crc kubenswrapper[4749]: I0128 18:56:55.898200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xkbsr" event={"ID":"41d52d0a-0edd-4a1f-a19b-674518ed9e6e","Type":"ContainerDied","Data":"455a9d44004cda7cea0bcfd4442740bfc7de6f54ab80713f3dc85ff239824e7b"} Jan 28 18:56:56 crc kubenswrapper[4749]: I0128 18:56:56.910621 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"63120686-40c3-4585-9e76-26484d36c17b","Type":"ContainerStarted","Data":"d639b8a3bfe72ee55142babc3e97dabc8b78b59375035dbdb08f0ddef4a8bec4"} Jan 28 18:56:56 crc kubenswrapper[4749]: I0128 18:56:56.933680 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=8.866823777 
podStartE2EDuration="11.933658843s" podCreationTimestamp="2026-01-28 18:56:45 +0000 UTC" firstStartedPulling="2026-01-28 18:56:52.708887943 +0000 UTC m=+1280.720414718" lastFinishedPulling="2026-01-28 18:56:55.775723009 +0000 UTC m=+1283.787249784" observedRunningTime="2026-01-28 18:56:56.927267324 +0000 UTC m=+1284.938794109" watchObservedRunningTime="2026-01-28 18:56:56.933658843 +0000 UTC m=+1284.945185608" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.460969 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.466790 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.466826 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.468214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-log-ovn\") pod \"ff54120f-d967-42cf-989f-1136ab0d5746\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-additional-scripts\") pod \"ff54120f-d967-42cf-989f-1136ab0d5746\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ff54120f-d967-42cf-989f-1136ab0d5746" (UID: "ff54120f-d967-42cf-989f-1136ab0d5746"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run-ovn\") pod \"ff54120f-d967-42cf-989f-1136ab0d5746\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634712 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-scripts\") pod \"ff54120f-d967-42cf-989f-1136ab0d5746\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ff54120f-d967-42cf-989f-1136ab0d5746" (UID: "ff54120f-d967-42cf-989f-1136ab0d5746"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634732 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run\") pod \"ff54120f-d967-42cf-989f-1136ab0d5746\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634761 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccj4v\" (UniqueName: \"kubernetes.io/projected/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-kube-api-access-ccj4v\") pod \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\" (UID: \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\") " Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run" (OuterVolumeSpecName: "var-run") pod "ff54120f-d967-42cf-989f-1136ab0d5746" (UID: "ff54120f-d967-42cf-989f-1136ab0d5746"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgxf9\" (UniqueName: \"kubernetes.io/projected/ff54120f-d967-42cf-989f-1136ab0d5746-kube-api-access-hgxf9\") pod \"ff54120f-d967-42cf-989f-1136ab0d5746\" (UID: \"ff54120f-d967-42cf-989f-1136ab0d5746\") " Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.634857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-operator-scripts\") pod \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\" (UID: \"41d52d0a-0edd-4a1f-a19b-674518ed9e6e\") " Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.635355 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.635377 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-run\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.635386 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff54120f-d967-42cf-989f-1136ab0d5746-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.635549 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41d52d0a-0edd-4a1f-a19b-674518ed9e6e" (UID: "41d52d0a-0edd-4a1f-a19b-674518ed9e6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.635613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-scripts" (OuterVolumeSpecName: "scripts") pod "ff54120f-d967-42cf-989f-1136ab0d5746" (UID: "ff54120f-d967-42cf-989f-1136ab0d5746"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.635842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ff54120f-d967-42cf-989f-1136ab0d5746" (UID: "ff54120f-d967-42cf-989f-1136ab0d5746"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.641635 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-kube-api-access-ccj4v" (OuterVolumeSpecName: "kube-api-access-ccj4v") pod "41d52d0a-0edd-4a1f-a19b-674518ed9e6e" (UID: "41d52d0a-0edd-4a1f-a19b-674518ed9e6e"). InnerVolumeSpecName "kube-api-access-ccj4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.654881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff54120f-d967-42cf-989f-1136ab0d5746-kube-api-access-hgxf9" (OuterVolumeSpecName: "kube-api-access-hgxf9") pod "ff54120f-d967-42cf-989f-1136ab0d5746" (UID: "ff54120f-d967-42cf-989f-1136ab0d5746"). InnerVolumeSpecName "kube-api-access-hgxf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.737189 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.737229 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccj4v\" (UniqueName: \"kubernetes.io/projected/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-kube-api-access-ccj4v\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.737242 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgxf9\" (UniqueName: \"kubernetes.io/projected/ff54120f-d967-42cf-989f-1136ab0d5746-kube-api-access-hgxf9\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.737251 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d52d0a-0edd-4a1f-a19b-674518ed9e6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.737260 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff54120f-d967-42cf-989f-1136ab0d5746-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.839266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.845191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1992d04-5d9f-498f-bee7-f2ab001feb76-etc-swift\") pod \"swift-storage-0\" (UID: \"a1992d04-5d9f-498f-bee7-f2ab001feb76\") " pod="openstack/swift-storage-0" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.920872 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xkbsr" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.920911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xkbsr" event={"ID":"41d52d0a-0edd-4a1f-a19b-674518ed9e6e","Type":"ContainerDied","Data":"d00b4e23961e6ae81e1ed3311deae9ac7a9baf51a71b51673b834838207c4c85"} Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.920974 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d00b4e23961e6ae81e1ed3311deae9ac7a9baf51a71b51673b834838207c4c85" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.965852 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-g8gw7-config-gf92b" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.965909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-g8gw7-config-gf92b" event={"ID":"ff54120f-d967-42cf-989f-1136ab0d5746","Type":"ContainerDied","Data":"7c4a90c24f22becb2583f63f54511522228301c51a85dfe6803b941024182a2a"} Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.965944 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c4a90c24f22becb2583f63f54511522228301c51a85dfe6803b941024182a2a" Jan 28 18:56:57 crc kubenswrapper[4749]: I0128 18:56:57.999246 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 28 18:56:58 crc kubenswrapper[4749]: I0128 18:56:58.016347 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-g8gw7-config-gf92b"] Jan 28 18:56:58 crc kubenswrapper[4749]: I0128 18:56:58.032198 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-g8gw7-config-gf92b"] Jan 28 18:56:58 crc kubenswrapper[4749]: I0128 18:56:58.609133 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 28 18:56:58 crc kubenswrapper[4749]: W0128 18:56:58.623286 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1992d04_5d9f_498f_bee7_f2ab001feb76.slice/crio-7b2ff5cf4f73ebff6fa2a72948ddb00f5d53bcf487b30cef63f5271c6f9f9af4 WatchSource:0}: Error finding container 7b2ff5cf4f73ebff6fa2a72948ddb00f5d53bcf487b30cef63f5271c6f9f9af4: Status 404 returned error can't find the container with id 7b2ff5cf4f73ebff6fa2a72948ddb00f5d53bcf487b30cef63f5271c6f9f9af4 Jan 28 18:56:58 crc kubenswrapper[4749]: I0128 18:56:58.882396 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff54120f-d967-42cf-989f-1136ab0d5746" path="/var/lib/kubelet/pods/ff54120f-d967-42cf-989f-1136ab0d5746/volumes" Jan 28 18:56:58 crc kubenswrapper[4749]: I0128 18:56:58.975894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"7b2ff5cf4f73ebff6fa2a72948ddb00f5d53bcf487b30cef63f5271c6f9f9af4"} Jan 28 18:56:59 crc kubenswrapper[4749]: I0128 18:56:59.323195 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-g8gw7" Jan 28 18:56:59 crc kubenswrapper[4749]: I0128 18:56:59.985788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3","Type":"ContainerStarted","Data":"2e2569e48139ecd7c6305f52b132f80d4bfb12a53b8750d99b6e4617c3d91d7a"} Jan 28 18:57:00 crc kubenswrapper[4749]: I0128 18:57:00.999619 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"c4513a9335312082a92477475f21a9071bbcaa6f72e6ef8499e5cc810927c8f2"} Jan 28 18:57:00 crc kubenswrapper[4749]: I0128 18:57:00.999979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"7c326b03f4686c20fbcf25adb0433483fee2623ac16fdd7b25de02178fa68627"} Jan 28 18:57:00 crc kubenswrapper[4749]: I0128 18:57:00.999993 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"be5cd06c720d7bc4ef09062cfbc5f49e79e9baff5c848f85d8acbfa0beda0449"} Jan 28 18:57:01 crc kubenswrapper[4749]: I0128 18:57:01.000002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"73d1e7da468cef7d2d84418dbf4fa8506866fbb92c7f9ac309c0e0134d9488d0"} Jan 28 18:57:02 crc kubenswrapper[4749]: I0128 18:57:02.012733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"a3f54eeab8a7daa2fc3c40fefd6ac2b35e861d259b821cdb5d44ca1c56d19989"} Jan 28 18:57:03 crc kubenswrapper[4749]: I0128 18:57:03.024996 4749 generic.go:334] "Generic (PLEG): container finished" podID="c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" containerID="b4df288d0fcf726ae5713c462e9a8a45c0baddbe762e49efc7dcc6880758d949" exitCode=0 Jan 28 18:57:03 crc kubenswrapper[4749]: I0128 18:57:03.025349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjt8b" event={"ID":"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f","Type":"ContainerDied","Data":"b4df288d0fcf726ae5713c462e9a8a45c0baddbe762e49efc7dcc6880758d949"} Jan 28 18:57:03 crc kubenswrapper[4749]: I0128 18:57:03.030761 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"6ba3af458df237bcca1502aa9f91e134cde6a75d211175cd782d7238acf77af6"} Jan 28 18:57:03 crc kubenswrapper[4749]: I0128 18:57:03.030810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"0462e001be595e302daf74d243103e9dd4fe57e087dcb35b9af432c78899674e"} Jan 28 18:57:03 crc kubenswrapper[4749]: I0128 18:57:03.030824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"61d14c293f5f6a1df83b1d61af4fecaef4641e1612ec9c33525456fb762bdf13"} Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.052725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"a2d69849d31c8166474ee863d54b461e6fa2e9f41b13da799bc3da7a505b9146"} Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.493998 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjt8b" Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.617957 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56fr8\" (UniqueName: \"kubernetes.io/projected/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-kube-api-access-56fr8\") pod \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.618144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-db-sync-config-data\") pod \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.618168 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-combined-ca-bundle\") pod \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.618958 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-config-data\") pod \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\" (UID: \"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f\") " Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.622926 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" (UID: "c395eb86-c6bf-4d4b-b0dd-b19cc34c004f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.622975 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-kube-api-access-56fr8" (OuterVolumeSpecName: "kube-api-access-56fr8") pod "c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" (UID: "c395eb86-c6bf-4d4b-b0dd-b19cc34c004f"). InnerVolumeSpecName "kube-api-access-56fr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.648774 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" (UID: "c395eb86-c6bf-4d4b-b0dd-b19cc34c004f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.672879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-config-data" (OuterVolumeSpecName: "config-data") pod "c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" (UID: "c395eb86-c6bf-4d4b-b0dd-b19cc34c004f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.721878 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.721919 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56fr8\" (UniqueName: \"kubernetes.io/projected/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-kube-api-access-56fr8\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.721935 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:04 crc kubenswrapper[4749]: I0128 18:57:04.721947 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.088995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"848735b634543007f94bce705818fe8952969ab3b1ccb5d432df623a4d23a203"} Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.089319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"371b0ec3ae2b290ea883b62d2154f58d2fc5afaf9a60f008e76245bd08bc1b17"} Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.089350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"e5f0f72b783f7f76a2d9f1c4dba3e780d9b229e7712ece56867ed03663c26418"} Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.089360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"ea9b20fbfda8dad9c9bf0a8271834d04612c7cd8865d9a7672d26726a96ae872"} Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.089371 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"0b2ead1ae2464bb59314000613db09e40042bd33473fa98d0c7a41c0951d6399"} Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.090886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjt8b" event={"ID":"c395eb86-c6bf-4d4b-b0dd-b19cc34c004f","Type":"ContainerDied","Data":"3ad85c2020bf280c3c3c6507b0fd47b562ef153f9985f6dc13736d8a09df8ba4"} Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.090913 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ad85c2020bf280c3c3c6507b0fd47b562ef153f9985f6dc13736d8a09df8ba4" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.090983 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjt8b" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.441196 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-kt44f"] Jan 28 18:57:05 crc kubenswrapper[4749]: E0128 18:57:05.441639 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff54120f-d967-42cf-989f-1136ab0d5746" containerName="ovn-config" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.441662 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff54120f-d967-42cf-989f-1136ab0d5746" containerName="ovn-config" Jan 28 18:57:05 crc kubenswrapper[4749]: E0128 18:57:05.441681 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" containerName="glance-db-sync" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.441688 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" containerName="glance-db-sync" Jan 28 18:57:05 crc kubenswrapper[4749]: E0128 18:57:05.441713 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d52d0a-0edd-4a1f-a19b-674518ed9e6e" containerName="mariadb-account-create-update" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.441719 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d52d0a-0edd-4a1f-a19b-674518ed9e6e" containerName="mariadb-account-create-update" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.441921 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d52d0a-0edd-4a1f-a19b-674518ed9e6e" containerName="mariadb-account-create-update" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.441940 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff54120f-d967-42cf-989f-1136ab0d5746" containerName="ovn-config" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.441957 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" containerName="glance-db-sync" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.443046 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.466213 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-kt44f"] Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.544718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-config\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.544780 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.544811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.544983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ktbt\" (UniqueName: \"kubernetes.io/projected/34fcb22d-0bcc-4950-a04f-bcd402c0a591-kube-api-access-2ktbt\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.545203 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.646866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-config\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.646938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.646962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.647027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2ktbt\" (UniqueName: \"kubernetes.io/projected/34fcb22d-0bcc-4950-a04f-bcd402c0a591-kube-api-access-2ktbt\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.647396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.648711 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.648942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-config\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.649031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.649100 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.665342 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ktbt\" (UniqueName: \"kubernetes.io/projected/34fcb22d-0bcc-4950-a04f-bcd402c0a591-kube-api-access-2ktbt\") pod \"dnsmasq-dns-5b946c75cc-kt44f\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:05 crc kubenswrapper[4749]: I0128 18:57:05.765627 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.112621 4749 generic.go:334] "Generic (PLEG): container finished" podID="56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3" containerID="2e2569e48139ecd7c6305f52b132f80d4bfb12a53b8750d99b6e4617c3d91d7a" exitCode=0 Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.112897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3","Type":"ContainerDied","Data":"2e2569e48139ecd7c6305f52b132f80d4bfb12a53b8750d99b6e4617c3d91d7a"} Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.139221 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1992d04-5d9f-498f-bee7-f2ab001feb76","Type":"ContainerStarted","Data":"46dd27a4ae5a0158bffa14e5baab3512f5918a14743aaefc76a9b9d6645725b7"} Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.217821 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.152126015 podStartE2EDuration="42.21779611s" podCreationTimestamp="2026-01-28 18:56:24 +0000 UTC" firstStartedPulling="2026-01-28 18:56:58.625726218 +0000 UTC m=+1286.637252983" lastFinishedPulling="2026-01-28 18:57:03.691396303 +0000 UTC m=+1291.702923078" observedRunningTime="2026-01-28 18:57:06.204371378 +0000 UTC m=+1294.215898173" watchObservedRunningTime="2026-01-28 18:57:06.21779611 +0000 UTC m=+1294.229322885" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.291787 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-kt44f"] Jan 28 18:57:06 crc kubenswrapper[4749]: W0128 18:57:06.301143 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34fcb22d_0bcc_4950_a04f_bcd402c0a591.slice/crio-ff5f833a513f60ca012c75cc940a7b53e5dc3a2c3d4132808bf30ec12563c7da WatchSource:0}: Error finding container ff5f833a513f60ca012c75cc940a7b53e5dc3a2c3d4132808bf30ec12563c7da: Status 404 returned error can't find the container with id ff5f833a513f60ca012c75cc940a7b53e5dc3a2c3d4132808bf30ec12563c7da Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.594445 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-kt44f"] Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.632739 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7qlgx"] Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.634839 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.638428 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.643603 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7qlgx"] Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.771464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.771548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdwqj\" (UniqueName: \"kubernetes.io/projected/32eefe58-baa1-451c-9902-545a2b6e472b-kube-api-access-sdwqj\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.771582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.771603 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.771640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-config\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.771658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.873462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-config\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.873712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: 
\"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.873926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.874069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdwqj\" (UniqueName: \"kubernetes.io/projected/32eefe58-baa1-451c-9902-545a2b6e472b-kube-api-access-sdwqj\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.874186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.874279 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.875416 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.876148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-config\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.876774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.877057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.877236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc 
kubenswrapper[4749]: I0128 18:57:06.892871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdwqj\" (UniqueName: \"kubernetes.io/projected/32eefe58-baa1-451c-9902-545a2b6e472b-kube-api-access-sdwqj\") pod \"dnsmasq-dns-74f6bcbc87-7qlgx\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:06 crc kubenswrapper[4749]: I0128 18:57:06.953179 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:07 crc kubenswrapper[4749]: I0128 18:57:07.156529 4749 generic.go:334] "Generic (PLEG): container finished" podID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" containerID="7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a" exitCode=0 Jan 28 18:57:07 crc kubenswrapper[4749]: I0128 18:57:07.156728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" event={"ID":"34fcb22d-0bcc-4950-a04f-bcd402c0a591","Type":"ContainerDied","Data":"7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a"} Jan 28 18:57:07 crc kubenswrapper[4749]: I0128 18:57:07.156817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" event={"ID":"34fcb22d-0bcc-4950-a04f-bcd402c0a591","Type":"ContainerStarted","Data":"ff5f833a513f60ca012c75cc940a7b53e5dc3a2c3d4132808bf30ec12563c7da"} Jan 28 18:57:07 crc kubenswrapper[4749]: I0128 18:57:07.160672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3","Type":"ContainerStarted","Data":"33191812c6e8dcd57d02134bad2ed6246463b511a1c9416fea915f9b5e056ddd"} Jan 28 18:57:07 crc kubenswrapper[4749]: I0128 18:57:07.420465 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7qlgx"] Jan 28 18:57:07 crc kubenswrapper[4749]: W0128 18:57:07.427591 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32eefe58_baa1_451c_9902_545a2b6e472b.slice/crio-d8a98bcb47fc3e24d159e8663a3b1994c2a315bdadbea4667171a0570edea63f WatchSource:0}: Error finding container d8a98bcb47fc3e24d159e8663a3b1994c2a315bdadbea4667171a0570edea63f: Status 404 returned error can't find the container with id d8a98bcb47fc3e24d159e8663a3b1994c2a315bdadbea4667171a0570edea63f Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.175134 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" event={"ID":"34fcb22d-0bcc-4950-a04f-bcd402c0a591","Type":"ContainerStarted","Data":"88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51"} Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.175458 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" podUID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" containerName="dnsmasq-dns" containerID="cri-o://88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51" gracePeriod=10 Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.175842 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.181108 4749 generic.go:334] "Generic (PLEG): container finished" podID="32eefe58-baa1-451c-9902-545a2b6e472b" 
containerID="b9e2517c4b2feebde90e443652a97f7363ab7cae9c4a3d2863497e77939ce474" exitCode=0 Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.181184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" event={"ID":"32eefe58-baa1-451c-9902-545a2b6e472b","Type":"ContainerDied","Data":"b9e2517c4b2feebde90e443652a97f7363ab7cae9c4a3d2863497e77939ce474"} Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.181259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" event={"ID":"32eefe58-baa1-451c-9902-545a2b6e472b","Type":"ContainerStarted","Data":"d8a98bcb47fc3e24d159e8663a3b1994c2a315bdadbea4667171a0570edea63f"} Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.211477 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" podStartSLOduration=3.211447452 podStartE2EDuration="3.211447452s" podCreationTimestamp="2026-01-28 18:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:08.207271658 +0000 UTC m=+1296.218798443" watchObservedRunningTime="2026-01-28 18:57:08.211447452 +0000 UTC m=+1296.222974217" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.694603 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.837920 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-sb\") pod \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.838030 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-nb\") pod \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.838092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ktbt\" (UniqueName: \"kubernetes.io/projected/34fcb22d-0bcc-4950-a04f-bcd402c0a591-kube-api-access-2ktbt\") pod \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.838138 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-dns-svc\") pod \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.838174 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-config\") pod \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\" (UID: \"34fcb22d-0bcc-4950-a04f-bcd402c0a591\") " Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.844797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34fcb22d-0bcc-4950-a04f-bcd402c0a591-kube-api-access-2ktbt" (OuterVolumeSpecName: "kube-api-access-2ktbt") pod "34fcb22d-0bcc-4950-a04f-bcd402c0a591" (UID: 
"34fcb22d-0bcc-4950-a04f-bcd402c0a591"). InnerVolumeSpecName "kube-api-access-2ktbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.899182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34fcb22d-0bcc-4950-a04f-bcd402c0a591" (UID: "34fcb22d-0bcc-4950-a04f-bcd402c0a591"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.909586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34fcb22d-0bcc-4950-a04f-bcd402c0a591" (UID: "34fcb22d-0bcc-4950-a04f-bcd402c0a591"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.909825 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34fcb22d-0bcc-4950-a04f-bcd402c0a591" (UID: "34fcb22d-0bcc-4950-a04f-bcd402c0a591"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.921748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-config" (OuterVolumeSpecName: "config") pod "34fcb22d-0bcc-4950-a04f-bcd402c0a591" (UID: "34fcb22d-0bcc-4950-a04f-bcd402c0a591"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.940766 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.940807 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.940818 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ktbt\" (UniqueName: \"kubernetes.io/projected/34fcb22d-0bcc-4950-a04f-bcd402c0a591-kube-api-access-2ktbt\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.940830 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:08 crc kubenswrapper[4749]: I0128 18:57:08.940840 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34fcb22d-0bcc-4950-a04f-bcd402c0a591-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.192526 4749 generic.go:334] "Generic (PLEG): container finished" podID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" containerID="88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51" exitCode=0 Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.192567 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.192586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" event={"ID":"34fcb22d-0bcc-4950-a04f-bcd402c0a591","Type":"ContainerDied","Data":"88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51"} Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.193066 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-kt44f" event={"ID":"34fcb22d-0bcc-4950-a04f-bcd402c0a591","Type":"ContainerDied","Data":"ff5f833a513f60ca012c75cc940a7b53e5dc3a2c3d4132808bf30ec12563c7da"} Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.193090 4749 scope.go:117] "RemoveContainer" containerID="88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.196478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" event={"ID":"32eefe58-baa1-451c-9902-545a2b6e472b","Type":"ContainerStarted","Data":"f0d35a0a68776b9816bde9a279d5e05dd2f57a512425c663cb8942eab26967fb"} Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.196744 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.214619 4749 scope.go:117] "RemoveContainer" containerID="7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.225715 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" podStartSLOduration=3.225685472 podStartE2EDuration="3.225685472s" 
podCreationTimestamp="2026-01-28 18:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:09.219038037 +0000 UTC m=+1297.230564822" watchObservedRunningTime="2026-01-28 18:57:09.225685472 +0000 UTC m=+1297.237212267" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.254116 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-kt44f"] Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.263716 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-kt44f"] Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.373610 4749 scope.go:117] "RemoveContainer" containerID="88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51" Jan 28 18:57:09 crc kubenswrapper[4749]: E0128 18:57:09.374113 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51\": container with ID starting with 88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51 not found: ID does not exist" containerID="88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.374151 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51"} err="failed to get container status \"88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51\": rpc error: code = NotFound desc = could not find container \"88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51\": container with ID starting with 88cfdef4eb1a0285df4c2770e9b331e2e97e743cb6ad2b2790ec5cb3d8ab2a51 not found: ID does not exist" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.374179 4749 scope.go:117] "RemoveContainer" containerID="7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a" Jan 28 18:57:09 crc kubenswrapper[4749]: E0128 18:57:09.374689 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a\": container with ID starting with 7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a not found: ID does not exist" containerID="7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.374722 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a"} err="failed to get container status \"7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a\": rpc error: code = NotFound desc = could not find container \"7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a\": container with ID starting with 7303d192223a76454f53eb9594ea8363073856783e3b73a47782e4091673739a not found: ID does not exist" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.585380 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="19a12543-c0a3-486f-b5bd-4f2862c15a37" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.602917 4749 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="94850daa-65af-4e6a-ad29-cfa28c3076e7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused" Jan 28 18:57:09 crc kubenswrapper[4749]: I0128 18:57:09.910542 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Jan 28 18:57:10 crc kubenswrapper[4749]: I0128 18:57:10.025524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 28 18:57:10 crc kubenswrapper[4749]: I0128 18:57:10.212025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3","Type":"ContainerStarted","Data":"47b657c0163b7d14e4df87d37e7fac56c810dfcd6ee25cf138b5352394665592"} Jan 28 18:57:10 crc kubenswrapper[4749]: I0128 18:57:10.212073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3","Type":"ContainerStarted","Data":"71f0d59c06114cba4111a113b58e5b88f8cc67c515294a5715cde1c5ce12bbf7"} Jan 28 18:57:10 crc kubenswrapper[4749]: I0128 18:57:10.882724 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" path="/var/lib/kubelet/pods/34fcb22d-0bcc-4950-a04f-bcd402c0a591/volumes" Jan 28 18:57:13 crc kubenswrapper[4749]: I0128 18:57:13.503058 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 28 18:57:16 crc kubenswrapper[4749]: I0128 18:57:16.954572 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.000445 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=25.000416813 podStartE2EDuration="25.000416813s" podCreationTimestamp="2026-01-28 18:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:10.235514704 +0000 UTC m=+1298.247041499" watchObservedRunningTime="2026-01-28 18:57:17.000416813 +0000 UTC m=+1305.011943628" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.037024 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7d8jv"] Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.037300 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-7d8jv" podUID="8903650b-b615-47c3-95e7-033bba3b379a" containerName="dnsmasq-dns" containerID="cri-o://a5e5d017fffb048aeb5ea24f8150a80f5a89277727e4e7310a203ab197fc06b2" gracePeriod=10 Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.286385 4749 generic.go:334] "Generic (PLEG): container finished" podID="8903650b-b615-47c3-95e7-033bba3b379a" containerID="a5e5d017fffb048aeb5ea24f8150a80f5a89277727e4e7310a203ab197fc06b2" exitCode=0 Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.286437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7d8jv" event={"ID":"8903650b-b615-47c3-95e7-033bba3b379a","Type":"ContainerDied","Data":"a5e5d017fffb048aeb5ea24f8150a80f5a89277727e4e7310a203ab197fc06b2"} Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.591289 4749 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.739562 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-dns-svc\") pod \"8903650b-b615-47c3-95e7-033bba3b379a\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.739820 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-sb\") pod \"8903650b-b615-47c3-95e7-033bba3b379a\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.739973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-config\") pod \"8903650b-b615-47c3-95e7-033bba3b379a\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.740030 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-nb\") pod \"8903650b-b615-47c3-95e7-033bba3b379a\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.740098 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25sp8\" (UniqueName: \"kubernetes.io/projected/8903650b-b615-47c3-95e7-033bba3b379a-kube-api-access-25sp8\") pod \"8903650b-b615-47c3-95e7-033bba3b379a\" (UID: \"8903650b-b615-47c3-95e7-033bba3b379a\") " Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.750591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8903650b-b615-47c3-95e7-033bba3b379a-kube-api-access-25sp8" (OuterVolumeSpecName: "kube-api-access-25sp8") pod "8903650b-b615-47c3-95e7-033bba3b379a" (UID: "8903650b-b615-47c3-95e7-033bba3b379a"). InnerVolumeSpecName "kube-api-access-25sp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.795147 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8903650b-b615-47c3-95e7-033bba3b379a" (UID: "8903650b-b615-47c3-95e7-033bba3b379a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.795282 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8903650b-b615-47c3-95e7-033bba3b379a" (UID: "8903650b-b615-47c3-95e7-033bba3b379a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.797370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-config" (OuterVolumeSpecName: "config") pod "8903650b-b615-47c3-95e7-033bba3b379a" (UID: "8903650b-b615-47c3-95e7-033bba3b379a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.823387 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8903650b-b615-47c3-95e7-033bba3b379a" (UID: "8903650b-b615-47c3-95e7-033bba3b379a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.842758 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.842800 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.842818 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25sp8\" (UniqueName: \"kubernetes.io/projected/8903650b-b615-47c3-95e7-033bba3b379a-kube-api-access-25sp8\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.842830 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:17 crc kubenswrapper[4749]: I0128 18:57:17.842841 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8903650b-b615-47c3-95e7-033bba3b379a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:18 crc kubenswrapper[4749]: I0128 18:57:18.297170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7d8jv" event={"ID":"8903650b-b615-47c3-95e7-033bba3b379a","Type":"ContainerDied","Data":"3fd2d56f89cba00975d76856651bbb41c380ba1a5e2e10a6ce07fe0c4efa9b3a"} Jan 28 18:57:18 crc kubenswrapper[4749]: I0128 18:57:18.297201 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7d8jv" Jan 28 18:57:18 crc kubenswrapper[4749]: I0128 18:57:18.297518 4749 scope.go:117] "RemoveContainer" containerID="a5e5d017fffb048aeb5ea24f8150a80f5a89277727e4e7310a203ab197fc06b2" Jan 28 18:57:18 crc kubenswrapper[4749]: I0128 18:57:18.328174 4749 scope.go:117] "RemoveContainer" containerID="759dfa449e6b8880e13e736e121ecf39a19cd9698500bca1e622ccbba25ff5e6" Jan 28 18:57:18 crc kubenswrapper[4749]: I0128 18:57:18.330378 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7d8jv"] Jan 28 18:57:18 crc kubenswrapper[4749]: I0128 18:57:18.338421 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7d8jv"] Jan 28 18:57:18 crc kubenswrapper[4749]: I0128 18:57:18.882673 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8903650b-b615-47c3-95e7-033bba3b379a" path="/var/lib/kubelet/pods/8903650b-b615-47c3-95e7-033bba3b379a/volumes" Jan 28 18:57:19 crc kubenswrapper[4749]: I0128 18:57:19.585455 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 28 18:57:19 crc kubenswrapper[4749]: I0128 18:57:19.603038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.594614 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-85s7b"] Jan 28 18:57:21 crc kubenswrapper[4749]: E0128 18:57:21.597474 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8903650b-b615-47c3-95e7-033bba3b379a" containerName="init" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.597506 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8903650b-b615-47c3-95e7-033bba3b379a" containerName="init" Jan 28 18:57:21 crc kubenswrapper[4749]: E0128 18:57:21.597525 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" containerName="dnsmasq-dns" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.597533 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" containerName="dnsmasq-dns" Jan 28 18:57:21 crc kubenswrapper[4749]: E0128 18:57:21.597583 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" containerName="init" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.597591 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" containerName="init" Jan 28 18:57:21 crc kubenswrapper[4749]: E0128 18:57:21.597601 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8903650b-b615-47c3-95e7-033bba3b379a" containerName="dnsmasq-dns" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.597608 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8903650b-b615-47c3-95e7-033bba3b379a" containerName="dnsmasq-dns" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.597880 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="34fcb22d-0bcc-4950-a04f-bcd402c0a591" containerName="dnsmasq-dns" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.597920 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8903650b-b615-47c3-95e7-033bba3b379a" containerName="dnsmasq-dns" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.599072 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.610664 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d98d-account-create-update-bwvmb"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.622113 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.625009 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.681962 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d98d-account-create-update-bwvmb"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.690476 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2wdfv"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.692177 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.724760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522h7\" (UniqueName: \"kubernetes.io/projected/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-kube-api-access-522h7\") pod \"cinder-db-create-85s7b\" (UID: \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\") " pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.724830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-operator-scripts\") pod \"cinder-db-create-85s7b\" (UID: \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\") " pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.724915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf7hp\" (UniqueName: \"kubernetes.io/projected/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-kube-api-access-gf7hp\") pod \"barbican-d98d-account-create-update-bwvmb\" (UID: \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\") " pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.724989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-operator-scripts\") pod \"barbican-d98d-account-create-update-bwvmb\" (UID: \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\") " pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.736522 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-85s7b"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.748385 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2wdfv"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.758999 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4db7-account-create-update-d7fz2"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.760499 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.762468 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.796615 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4db7-account-create-update-d7fz2"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.809177 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-9zdr2"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.811447 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.829770 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-9zdr2"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.832473 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jm6c\" (UniqueName: \"kubernetes.io/projected/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-kube-api-access-5jm6c\") pod \"cinder-4db7-account-create-update-d7fz2\" (UID: \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\") " pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.832573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-operator-scripts\") pod \"barbican-d98d-account-create-update-bwvmb\" (UID: \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\") " pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.832630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xlhh\" (UniqueName: \"kubernetes.io/projected/c40db02b-0440-4ca9-a0a7-86e18dede584-kube-api-access-5xlhh\") pod \"barbican-db-create-2wdfv\" (UID: \"c40db02b-0440-4ca9-a0a7-86e18dede584\") " pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.832663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-522h7\" (UniqueName: \"kubernetes.io/projected/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-kube-api-access-522h7\") pod \"cinder-db-create-85s7b\" (UID: \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\") " pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.832695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-operator-scripts\") pod \"cinder-4db7-account-create-update-d7fz2\" (UID: \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\") " pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.832728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-operator-scripts\") pod \"cinder-db-create-85s7b\" (UID: \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\") " pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.832828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40db02b-0440-4ca9-a0a7-86e18dede584-operator-scripts\") pod \"barbican-db-create-2wdfv\" (UID: \"c40db02b-0440-4ca9-a0a7-86e18dede584\") " pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.833581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf7hp\" (UniqueName: \"kubernetes.io/projected/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-kube-api-access-gf7hp\") pod \"barbican-d98d-account-create-update-bwvmb\" (UID: \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\") " pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.833839 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-operator-scripts\") pod \"cinder-db-create-85s7b\" (UID: \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\") " pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.833537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-operator-scripts\") pod \"barbican-d98d-account-create-update-bwvmb\" (UID: \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\") " pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.850153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf7hp\" (UniqueName: \"kubernetes.io/projected/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-kube-api-access-gf7hp\") pod \"barbican-d98d-account-create-update-bwvmb\" (UID: \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\") " pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.851112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-522h7\" (UniqueName: \"kubernetes.io/projected/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-kube-api-access-522h7\") pod \"cinder-db-create-85s7b\" (UID: \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\") " pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.907313 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-a2a6-account-create-update-c8ps8"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.909317 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.915612 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.933229 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.934457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40db02b-0440-4ca9-a0a7-86e18dede584-operator-scripts\") pod \"barbican-db-create-2wdfv\" (UID: \"c40db02b-0440-4ca9-a0a7-86e18dede584\") " pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.934547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jm6c\" (UniqueName: \"kubernetes.io/projected/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-kube-api-access-5jm6c\") pod \"cinder-4db7-account-create-update-d7fz2\" (UID: \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\") " pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.934608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xlhh\" (UniqueName: \"kubernetes.io/projected/c40db02b-0440-4ca9-a0a7-86e18dede584-kube-api-access-5xlhh\") pod \"barbican-db-create-2wdfv\" (UID: \"c40db02b-0440-4ca9-a0a7-86e18dede584\") " pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.934653 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl5c5\" (UniqueName: \"kubernetes.io/projected/55f15fcf-5410-4b63-a8c1-b3e7329763ec-kube-api-access-nl5c5\") pod \"heat-db-create-9zdr2\" (UID: \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\") " pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.934670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f15fcf-5410-4b63-a8c1-b3e7329763ec-operator-scripts\") pod \"heat-db-create-9zdr2\" (UID: \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\") " pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.934692 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-operator-scripts\") pod \"cinder-4db7-account-create-update-d7fz2\" (UID: \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\") " pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.935751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40db02b-0440-4ca9-a0a7-86e18dede584-operator-scripts\") pod \"barbican-db-create-2wdfv\" (UID: \"c40db02b-0440-4ca9-a0a7-86e18dede584\") " pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.936975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-operator-scripts\") pod \"cinder-4db7-account-create-update-d7fz2\" (UID: \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\") " pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.940378 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a2a6-account-create-update-c8ps8"] Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.949563 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.958092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xlhh\" (UniqueName: \"kubernetes.io/projected/c40db02b-0440-4ca9-a0a7-86e18dede584-kube-api-access-5xlhh\") pod \"barbican-db-create-2wdfv\" (UID: \"c40db02b-0440-4ca9-a0a7-86e18dede584\") " pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:21 crc kubenswrapper[4749]: I0128 18:57:21.964447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jm6c\" (UniqueName: \"kubernetes.io/projected/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-kube-api-access-5jm6c\") pod \"cinder-4db7-account-create-update-d7fz2\" (UID: \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\") " pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.028127 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-74ad-account-create-update-vc5lw"] Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.029638 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.036916 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.043086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a45c64-e893-493f-803e-46674d56ec70-operator-scripts\") pod \"heat-a2a6-account-create-update-c8ps8\" (UID: \"13a45c64-e893-493f-803e-46674d56ec70\") " pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.043217 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.043774 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f15fcf-5410-4b63-a8c1-b3e7329763ec-operator-scripts\") pod \"heat-db-create-9zdr2\" (UID: \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\") " pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.043826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl5c5\" (UniqueName: \"kubernetes.io/projected/55f15fcf-5410-4b63-a8c1-b3e7329763ec-kube-api-access-nl5c5\") pod \"heat-db-create-9zdr2\" (UID: \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\") " pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.043934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp8b8\" (UniqueName: \"kubernetes.io/projected/13a45c64-e893-493f-803e-46674d56ec70-kube-api-access-tp8b8\") pod \"heat-a2a6-account-create-update-c8ps8\" (UID: \"13a45c64-e893-493f-803e-46674d56ec70\") " pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.058153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f15fcf-5410-4b63-a8c1-b3e7329763ec-operator-scripts\") pod \"heat-db-create-9zdr2\" (UID: \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\") " pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.097668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl5c5\" (UniqueName: \"kubernetes.io/projected/55f15fcf-5410-4b63-a8c1-b3e7329763ec-kube-api-access-nl5c5\") pod \"heat-db-create-9zdr2\" (UID: \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\") " pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.103539 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74ad-account-create-update-vc5lw"] Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.109722 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.138764 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.146490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gt27\" (UniqueName: \"kubernetes.io/projected/0042f276-29d6-4d7c-938d-4ab73a8162a5-kube-api-access-6gt27\") pod \"neutron-74ad-account-create-update-vc5lw\" (UID: \"0042f276-29d6-4d7c-938d-4ab73a8162a5\") " pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.146580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0042f276-29d6-4d7c-938d-4ab73a8162a5-operator-scripts\") pod \"neutron-74ad-account-create-update-vc5lw\" (UID: \"0042f276-29d6-4d7c-938d-4ab73a8162a5\") " pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.146633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp8b8\" (UniqueName: \"kubernetes.io/projected/13a45c64-e893-493f-803e-46674d56ec70-kube-api-access-tp8b8\") pod \"heat-a2a6-account-create-update-c8ps8\" (UID: \"13a45c64-e893-493f-803e-46674d56ec70\") " pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.146769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a45c64-e893-493f-803e-46674d56ec70-operator-scripts\") pod \"heat-a2a6-account-create-update-c8ps8\" (UID: \"13a45c64-e893-493f-803e-46674d56ec70\") " pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.147806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a45c64-e893-493f-803e-46674d56ec70-operator-scripts\") pod \"heat-a2a6-account-create-update-c8ps8\" (UID: \"13a45c64-e893-493f-803e-46674d56ec70\") " pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.161806 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-668s7"] Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.181772 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-668s7"] Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.183976 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-668s7" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.188058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp8b8\" (UniqueName: \"kubernetes.io/projected/13a45c64-e893-493f-803e-46674d56ec70-kube-api-access-tp8b8\") pod \"heat-a2a6-account-create-update-c8ps8\" (UID: \"13a45c64-e893-493f-803e-46674d56ec70\") " pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.210112 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-l6q85"] Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.213463 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.217818 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wl54b" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.221222 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.221680 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.225501 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.227055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-l6q85"] Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.248287 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c62q\" (UniqueName: \"kubernetes.io/projected/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-kube-api-access-6c62q\") pod \"neutron-db-create-668s7\" (UID: \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\") " pod="openstack/neutron-db-create-668s7" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.248507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-operator-scripts\") pod \"neutron-db-create-668s7\" (UID: \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\") " pod="openstack/neutron-db-create-668s7" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.248553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gt27\" (UniqueName: \"kubernetes.io/projected/0042f276-29d6-4d7c-938d-4ab73a8162a5-kube-api-access-6gt27\") pod \"neutron-74ad-account-create-update-vc5lw\" (UID: \"0042f276-29d6-4d7c-938d-4ab73a8162a5\") " pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.248619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0042f276-29d6-4d7c-938d-4ab73a8162a5-operator-scripts\") pod \"neutron-74ad-account-create-update-vc5lw\" (UID: \"0042f276-29d6-4d7c-938d-4ab73a8162a5\") " pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.250025 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0042f276-29d6-4d7c-938d-4ab73a8162a5-operator-scripts\") pod \"neutron-74ad-account-create-update-vc5lw\" (UID: \"0042f276-29d6-4d7c-938d-4ab73a8162a5\") " pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.279704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gt27\" (UniqueName: \"kubernetes.io/projected/0042f276-29d6-4d7c-938d-4ab73a8162a5-kube-api-access-6gt27\") pod \"neutron-74ad-account-create-update-vc5lw\" (UID: \"0042f276-29d6-4d7c-938d-4ab73a8162a5\") " pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.372097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-operator-scripts\") pod \"neutron-db-create-668s7\" (UID: \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\") " pod="openstack/neutron-db-create-668s7" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.372138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-combined-ca-bundle\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.372246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c62q\" (UniqueName: \"kubernetes.io/projected/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-kube-api-access-6c62q\") pod \"neutron-db-create-668s7\" (UID: \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\") " pod="openstack/neutron-db-create-668s7" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.372275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgg8q\" (UniqueName: \"kubernetes.io/projected/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-kube-api-access-wgg8q\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.372367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-config-data\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.372892 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-operator-scripts\") pod \"neutron-db-create-668s7\" (UID: \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\") " pod="openstack/neutron-db-create-668s7" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.395665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c62q\" (UniqueName: \"kubernetes.io/projected/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-kube-api-access-6c62q\") pod \"neutron-db-create-668s7\" (UID: \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\") " pod="openstack/neutron-db-create-668s7" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.434672 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.460258 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.474806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-combined-ca-bundle\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.474985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgg8q\" (UniqueName: \"kubernetes.io/projected/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-kube-api-access-wgg8q\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.475091 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-config-data\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.480088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-combined-ca-bundle\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.482695 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-config-data\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.502741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgg8q\" (UniqueName: \"kubernetes.io/projected/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-kube-api-access-wgg8q\") pod \"keystone-db-sync-l6q85\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.528224 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-668s7" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.542583 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.564291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d98d-account-create-update-bwvmb"] Jan 28 18:57:22 crc kubenswrapper[4749]: W0128 18:57:22.599679 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93832f13_c8d6_4b51_a99a_0a4a3383d7cc.slice/crio-1fbc76a5f7523178bf559a6147d473eb4e0220936d30b19ddff26d9b5116179a WatchSource:0}: Error finding container 1fbc76a5f7523178bf559a6147d473eb4e0220936d30b19ddff26d9b5116179a: Status 404 returned error can't find the container with id 1fbc76a5f7523178bf559a6147d473eb4e0220936d30b19ddff26d9b5116179a Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.666071 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-85s7b"] Jan 28 18:57:22 crc kubenswrapper[4749]: I0128 18:57:22.800358 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4db7-account-create-update-d7fz2"] Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.009918 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2wdfv"] Jan 28 18:57:23 crc kubenswrapper[4749]: W0128 18:57:23.016076 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40db02b_0440_4ca9_a0a7_86e18dede584.slice/crio-c59a4d94bf55fe7573cd2df17742e0a203276dba811a1b75e8733083b7d88962 WatchSource:0}: Error finding container c59a4d94bf55fe7573cd2df17742e0a203276dba811a1b75e8733083b7d88962: Status 404 returned error can't find the container with id c59a4d94bf55fe7573cd2df17742e0a203276dba811a1b75e8733083b7d88962 Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.025861 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-9zdr2"] Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.186471 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a2a6-account-create-update-c8ps8"] Jan 28 18:57:23 crc kubenswrapper[4749]: W0128 18:57:23.199559 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13a45c64_e893_493f_803e_46674d56ec70.slice/crio-830311cf5ef59c6d541b78b52203d31678279c64a7ed7358e11f3552df9d42b8 WatchSource:0}: Error finding container 830311cf5ef59c6d541b78b52203d31678279c64a7ed7358e11f3552df9d42b8: Status 404 returned error can't find the container with id 830311cf5ef59c6d541b78b52203d31678279c64a7ed7358e11f3552df9d42b8 Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.381802 4749 generic.go:334] "Generic (PLEG): container finished" podID="d4b87adb-b9c0-464c-93e8-e7b8528b8b51" containerID="4e32af488b666924810d28e3e0c7e2000c1b37473ed8d47f32719a69367104f3" exitCode=0 Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.381894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85s7b" event={"ID":"d4b87adb-b9c0-464c-93e8-e7b8528b8b51","Type":"ContainerDied","Data":"4e32af488b666924810d28e3e0c7e2000c1b37473ed8d47f32719a69367104f3"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.381921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85s7b" 
event={"ID":"d4b87adb-b9c0-464c-93e8-e7b8528b8b51","Type":"ContainerStarted","Data":"5bfbebfd4140d86b04cb49db07bff8e4d3897898a579871954bf94c046cfa652"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.391297 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2wdfv" event={"ID":"c40db02b-0440-4ca9-a0a7-86e18dede584","Type":"ContainerStarted","Data":"9510a383eade54e93a1c4916907ba24ef9118dc4136096d9a2f92f767795d92b"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.391359 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2wdfv" event={"ID":"c40db02b-0440-4ca9-a0a7-86e18dede584","Type":"ContainerStarted","Data":"c59a4d94bf55fe7573cd2df17742e0a203276dba811a1b75e8733083b7d88962"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.395142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4db7-account-create-update-d7fz2" event={"ID":"4465bf8f-73d3-4317-b90e-4f6eac8a59c4","Type":"ContainerStarted","Data":"60a8059d263f063b4dd1e964e8fc92e7a692f539e31e1963000aeeb6c32ddf54"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.395190 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4db7-account-create-update-d7fz2" event={"ID":"4465bf8f-73d3-4317-b90e-4f6eac8a59c4","Type":"ContainerStarted","Data":"450729155ad68a5aff60b9edd67b7d23b2c53ee30f32d78aeaccd5145dd5ef6d"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.396739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a2a6-account-create-update-c8ps8" event={"ID":"13a45c64-e893-493f-803e-46674d56ec70","Type":"ContainerStarted","Data":"830311cf5ef59c6d541b78b52203d31678279c64a7ed7358e11f3552df9d42b8"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.404350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9zdr2" event={"ID":"55f15fcf-5410-4b63-a8c1-b3e7329763ec","Type":"ContainerStarted","Data":"5aec11ea2f974d2e68f51a1e7f1188b0fb61ca82c753cb4e70cb721b56c2bda5"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.404392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9zdr2" event={"ID":"55f15fcf-5410-4b63-a8c1-b3e7329763ec","Type":"ContainerStarted","Data":"13197748de4fbaaffc5d98b0d3dad5369131c4d82b07adef3d729b74c535921e"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.421615 4749 generic.go:334] "Generic (PLEG): container finished" podID="93832f13-c8d6-4b51-a99a-0a4a3383d7cc" containerID="c3472472691f8d9c6997cfa00f24c9f20e5ae67194723532afa4dbdfde7f1d01" exitCode=0 Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.421655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d98d-account-create-update-bwvmb" event={"ID":"93832f13-c8d6-4b51-a99a-0a4a3383d7cc","Type":"ContainerDied","Data":"c3472472691f8d9c6997cfa00f24c9f20e5ae67194723532afa4dbdfde7f1d01"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.421680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d98d-account-create-update-bwvmb" event={"ID":"93832f13-c8d6-4b51-a99a-0a4a3383d7cc","Type":"ContainerStarted","Data":"1fbc76a5f7523178bf559a6147d473eb4e0220936d30b19ddff26d9b5116179a"} Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.431035 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-l6q85"] Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.457064 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-74ad-account-create-update-vc5lw"] Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.503221 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.506297 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-668s7"] Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.506484 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-2wdfv" podStartSLOduration=2.5064659540000003 podStartE2EDuration="2.506465954s" podCreationTimestamp="2026-01-28 18:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:23.42988049 +0000 UTC m=+1311.441407265" watchObservedRunningTime="2026-01-28 18:57:23.506465954 +0000 UTC m=+1311.517992729" Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.517627 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.527883 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-4db7-account-create-update-d7fz2" podStartSLOduration=2.5278667649999997 podStartE2EDuration="2.527866765s" podCreationTimestamp="2026-01-28 18:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:23.447856854 +0000 UTC m=+1311.459383629" watchObservedRunningTime="2026-01-28 18:57:23.527866765 +0000 UTC m=+1311.539393540" Jan 28 18:57:23 crc kubenswrapper[4749]: I0128 18:57:23.572550 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-9zdr2" podStartSLOduration=2.572527129 podStartE2EDuration="2.572527129s" podCreationTimestamp="2026-01-28 18:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:23.472459733 +0000 UTC m=+1311.483986508" watchObservedRunningTime="2026-01-28 18:57:23.572527129 +0000 UTC m=+1311.584053904" Jan 28 18:57:24 crc kubenswrapper[4749]: E0128 18:57:24.354641 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc80fe934_a893_4d6e_9ca1_6df8d12dda6a.slice/crio-conmon-a765e42e1a21b9f9541f537d44b145537a7be0d3bc5a98e662dffc8f45af5dfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0042f276_29d6_4d7c_938d_4ab73a8162a5.slice/crio-conmon-57d613d27573952eb2b44fc6f30ea26c59186678a3d0d4ae0d3483d49c996e37.scope\": RecentStats: unable to find data in memory cache]" Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.432996 4749 generic.go:334] "Generic (PLEG): container finished" podID="4465bf8f-73d3-4317-b90e-4f6eac8a59c4" containerID="60a8059d263f063b4dd1e964e8fc92e7a692f539e31e1963000aeeb6c32ddf54" exitCode=0 Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.433089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4db7-account-create-update-d7fz2" event={"ID":"4465bf8f-73d3-4317-b90e-4f6eac8a59c4","Type":"ContainerDied","Data":"60a8059d263f063b4dd1e964e8fc92e7a692f539e31e1963000aeeb6c32ddf54"} 
Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.436144 4749 generic.go:334] "Generic (PLEG): container finished" podID="c80fe934-a893-4d6e-9ca1-6df8d12dda6a" containerID="a765e42e1a21b9f9541f537d44b145537a7be0d3bc5a98e662dffc8f45af5dfd" exitCode=0 Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.436315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-668s7" event={"ID":"c80fe934-a893-4d6e-9ca1-6df8d12dda6a","Type":"ContainerDied","Data":"a765e42e1a21b9f9541f537d44b145537a7be0d3bc5a98e662dffc8f45af5dfd"} Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.436435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-668s7" event={"ID":"c80fe934-a893-4d6e-9ca1-6df8d12dda6a","Type":"ContainerStarted","Data":"68bf1f39217a2dd6dbd76a77c90979ae70d7aa2a1ad5b02a1f9af46d49497b01"} Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.443201 4749 generic.go:334] "Generic (PLEG): container finished" podID="13a45c64-e893-493f-803e-46674d56ec70" containerID="f4c2e2176cf18698a23209257d01e1598f62e5f1513b2b83cc3c6f63a25591d3" exitCode=0 Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.443288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a2a6-account-create-update-c8ps8" event={"ID":"13a45c64-e893-493f-803e-46674d56ec70","Type":"ContainerDied","Data":"f4c2e2176cf18698a23209257d01e1598f62e5f1513b2b83cc3c6f63a25591d3"} Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.444869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l6q85" event={"ID":"8e9438aa-0957-4b78-8f0f-a12fb94e86b7","Type":"ContainerStarted","Data":"2592fa7ae339e06c2a8ebf3ca79b508d84afd309536cf52388f8bf8f0d824df7"} Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.450179 4749 generic.go:334] "Generic (PLEG): container finished" podID="0042f276-29d6-4d7c-938d-4ab73a8162a5" containerID="57d613d27573952eb2b44fc6f30ea26c59186678a3d0d4ae0d3483d49c996e37" exitCode=0 Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.450303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ad-account-create-update-vc5lw" event={"ID":"0042f276-29d6-4d7c-938d-4ab73a8162a5","Type":"ContainerDied","Data":"57d613d27573952eb2b44fc6f30ea26c59186678a3d0d4ae0d3483d49c996e37"} Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.450405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ad-account-create-update-vc5lw" event={"ID":"0042f276-29d6-4d7c-938d-4ab73a8162a5","Type":"ContainerStarted","Data":"ae1926f797a30e201ff810d55f43701e8c20baf2d0f2834dcba46fe174fe8b26"} Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.452496 4749 generic.go:334] "Generic (PLEG): container finished" podID="55f15fcf-5410-4b63-a8c1-b3e7329763ec" containerID="5aec11ea2f974d2e68f51a1e7f1188b0fb61ca82c753cb4e70cb721b56c2bda5" exitCode=0 Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.452631 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9zdr2" event={"ID":"55f15fcf-5410-4b63-a8c1-b3e7329763ec","Type":"ContainerDied","Data":"5aec11ea2f974d2e68f51a1e7f1188b0fb61ca82c753cb4e70cb721b56c2bda5"} Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.455241 4749 generic.go:334] "Generic (PLEG): container finished" podID="c40db02b-0440-4ca9-a0a7-86e18dede584" containerID="9510a383eade54e93a1c4916907ba24ef9118dc4136096d9a2f92f767795d92b" exitCode=0 Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.455637 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2wdfv" event={"ID":"c40db02b-0440-4ca9-a0a7-86e18dede584","Type":"ContainerDied","Data":"9510a383eade54e93a1c4916907ba24ef9118dc4136096d9a2f92f767795d92b"} Jan 28 18:57:24 crc kubenswrapper[4749]: I0128 18:57:24.463591 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.031191 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.125229 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.130460 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-operator-scripts\") pod \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\" (UID: \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\") " Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.130716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522h7\" (UniqueName: \"kubernetes.io/projected/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-kube-api-access-522h7\") pod \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\" (UID: \"d4b87adb-b9c0-464c-93e8-e7b8528b8b51\") " Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.132149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4b87adb-b9c0-464c-93e8-e7b8528b8b51" (UID: "d4b87adb-b9c0-464c-93e8-e7b8528b8b51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.137479 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-kube-api-access-522h7" (OuterVolumeSpecName: "kube-api-access-522h7") pod "d4b87adb-b9c0-464c-93e8-e7b8528b8b51" (UID: "d4b87adb-b9c0-464c-93e8-e7b8528b8b51"). InnerVolumeSpecName "kube-api-access-522h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.233844 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-operator-scripts\") pod \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\" (UID: \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\") " Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.233961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf7hp\" (UniqueName: \"kubernetes.io/projected/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-kube-api-access-gf7hp\") pod \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\" (UID: \"93832f13-c8d6-4b51-a99a-0a4a3383d7cc\") " Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.234251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93832f13-c8d6-4b51-a99a-0a4a3383d7cc" (UID: "93832f13-c8d6-4b51-a99a-0a4a3383d7cc"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.234731 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.234748 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.234767 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-522h7\" (UniqueName: \"kubernetes.io/projected/d4b87adb-b9c0-464c-93e8-e7b8528b8b51-kube-api-access-522h7\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.236863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-kube-api-access-gf7hp" (OuterVolumeSpecName: "kube-api-access-gf7hp") pod "93832f13-c8d6-4b51-a99a-0a4a3383d7cc" (UID: "93832f13-c8d6-4b51-a99a-0a4a3383d7cc"). InnerVolumeSpecName "kube-api-access-gf7hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.337389 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf7hp\" (UniqueName: \"kubernetes.io/projected/93832f13-c8d6-4b51-a99a-0a4a3383d7cc-kube-api-access-gf7hp\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.465152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d98d-account-create-update-bwvmb" event={"ID":"93832f13-c8d6-4b51-a99a-0a4a3383d7cc","Type":"ContainerDied","Data":"1fbc76a5f7523178bf559a6147d473eb4e0220936d30b19ddff26d9b5116179a"} Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.465212 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d98d-account-create-update-bwvmb" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.465411 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fbc76a5f7523178bf559a6147d473eb4e0220936d30b19ddff26d9b5116179a" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.467025 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-85s7b" Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.467099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85s7b" event={"ID":"d4b87adb-b9c0-464c-93e8-e7b8528b8b51","Type":"ContainerDied","Data":"5bfbebfd4140d86b04cb49db07bff8e4d3897898a579871954bf94c046cfa652"} Jan 28 18:57:25 crc kubenswrapper[4749]: I0128 18:57:25.467141 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bfbebfd4140d86b04cb49db07bff8e4d3897898a579871954bf94c046cfa652" Jan 28 18:57:27 crc kubenswrapper[4749]: I0128 18:57:27.467384 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:57:27 crc kubenswrapper[4749]: I0128 18:57:27.467972 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.418963 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.427265 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.447926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40db02b-0440-4ca9-a0a7-86e18dede584-operator-scripts\") pod \"c40db02b-0440-4ca9-a0a7-86e18dede584\" (UID: \"c40db02b-0440-4ca9-a0a7-86e18dede584\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.448287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0042f276-29d6-4d7c-938d-4ab73a8162a5-operator-scripts\") pod \"0042f276-29d6-4d7c-938d-4ab73a8162a5\" (UID: \"0042f276-29d6-4d7c-938d-4ab73a8162a5\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.448500 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xlhh\" (UniqueName: \"kubernetes.io/projected/c40db02b-0440-4ca9-a0a7-86e18dede584-kube-api-access-5xlhh\") pod \"c40db02b-0440-4ca9-a0a7-86e18dede584\" (UID: \"c40db02b-0440-4ca9-a0a7-86e18dede584\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.448589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gt27\" (UniqueName: \"kubernetes.io/projected/0042f276-29d6-4d7c-938d-4ab73a8162a5-kube-api-access-6gt27\") pod \"0042f276-29d6-4d7c-938d-4ab73a8162a5\" (UID: \"0042f276-29d6-4d7c-938d-4ab73a8162a5\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.448884 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c40db02b-0440-4ca9-a0a7-86e18dede584-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c40db02b-0440-4ca9-a0a7-86e18dede584" (UID: "c40db02b-0440-4ca9-a0a7-86e18dede584"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.448945 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0042f276-29d6-4d7c-938d-4ab73a8162a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0042f276-29d6-4d7c-938d-4ab73a8162a5" (UID: "0042f276-29d6-4d7c-938d-4ab73a8162a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.449533 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40db02b-0440-4ca9-a0a7-86e18dede584-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.449554 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0042f276-29d6-4d7c-938d-4ab73a8162a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.484890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40db02b-0440-4ca9-a0a7-86e18dede584-kube-api-access-5xlhh" (OuterVolumeSpecName: "kube-api-access-5xlhh") pod "c40db02b-0440-4ca9-a0a7-86e18dede584" (UID: "c40db02b-0440-4ca9-a0a7-86e18dede584"). InnerVolumeSpecName "kube-api-access-5xlhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.485111 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0042f276-29d6-4d7c-938d-4ab73a8162a5-kube-api-access-6gt27" (OuterVolumeSpecName: "kube-api-access-6gt27") pod "0042f276-29d6-4d7c-938d-4ab73a8162a5" (UID: "0042f276-29d6-4d7c-938d-4ab73a8162a5"). InnerVolumeSpecName "kube-api-access-6gt27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.490596 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-668s7" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.529695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9zdr2" event={"ID":"55f15fcf-5410-4b63-a8c1-b3e7329763ec","Type":"ContainerDied","Data":"13197748de4fbaaffc5d98b0d3dad5369131c4d82b07adef3d729b74c535921e"} Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.529740 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13197748de4fbaaffc5d98b0d3dad5369131c4d82b07adef3d729b74c535921e" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.538219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2wdfv" event={"ID":"c40db02b-0440-4ca9-a0a7-86e18dede584","Type":"ContainerDied","Data":"c59a4d94bf55fe7573cd2df17742e0a203276dba811a1b75e8733083b7d88962"} Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.538289 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c59a4d94bf55fe7573cd2df17742e0a203276dba811a1b75e8733083b7d88962" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.538249 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2wdfv" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.539934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4db7-account-create-update-d7fz2" event={"ID":"4465bf8f-73d3-4317-b90e-4f6eac8a59c4","Type":"ContainerDied","Data":"450729155ad68a5aff60b9edd67b7d23b2c53ee30f32d78aeaccd5145dd5ef6d"} Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.539975 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="450729155ad68a5aff60b9edd67b7d23b2c53ee30f32d78aeaccd5145dd5ef6d" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.542414 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-668s7" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.542403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-668s7" event={"ID":"c80fe934-a893-4d6e-9ca1-6df8d12dda6a","Type":"ContainerDied","Data":"68bf1f39217a2dd6dbd76a77c90979ae70d7aa2a1ad5b02a1f9af46d49497b01"} Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.542526 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68bf1f39217a2dd6dbd76a77c90979ae70d7aa2a1ad5b02a1f9af46d49497b01" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.544276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a2a6-account-create-update-c8ps8" event={"ID":"13a45c64-e893-493f-803e-46674d56ec70","Type":"ContainerDied","Data":"830311cf5ef59c6d541b78b52203d31678279c64a7ed7358e11f3552df9d42b8"} Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.544308 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="830311cf5ef59c6d541b78b52203d31678279c64a7ed7358e11f3552df9d42b8" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.545814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ad-account-create-update-vc5lw" event={"ID":"0042f276-29d6-4d7c-938d-4ab73a8162a5","Type":"ContainerDied","Data":"ae1926f797a30e201ff810d55f43701e8c20baf2d0f2834dcba46fe174fe8b26"} Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.545837 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae1926f797a30e201ff810d55f43701e8c20baf2d0f2834dcba46fe174fe8b26" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.545873 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74ad-account-create-update-vc5lw" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.552024 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xlhh\" (UniqueName: \"kubernetes.io/projected/c40db02b-0440-4ca9-a0a7-86e18dede584-kube-api-access-5xlhh\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.552070 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gt27\" (UniqueName: \"kubernetes.io/projected/0042f276-29d6-4d7c-938d-4ab73a8162a5-kube-api-access-6gt27\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.581214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.591711 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.604895 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.654256 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl5c5\" (UniqueName: \"kubernetes.io/projected/55f15fcf-5410-4b63-a8c1-b3e7329763ec-kube-api-access-nl5c5\") pod \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\" (UID: \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.654316 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-operator-scripts\") pod \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\" (UID: \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.654774 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c80fe934-a893-4d6e-9ca1-6df8d12dda6a" (UID: "c80fe934-a893-4d6e-9ca1-6df8d12dda6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.654806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jm6c\" (UniqueName: \"kubernetes.io/projected/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-kube-api-access-5jm6c\") pod \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\" (UID: \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.654901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp8b8\" (UniqueName: \"kubernetes.io/projected/13a45c64-e893-493f-803e-46674d56ec70-kube-api-access-tp8b8\") pod \"13a45c64-e893-493f-803e-46674d56ec70\" (UID: \"13a45c64-e893-493f-803e-46674d56ec70\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.654941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-operator-scripts\") pod \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\" (UID: \"4465bf8f-73d3-4317-b90e-4f6eac8a59c4\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.654970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f15fcf-5410-4b63-a8c1-b3e7329763ec-operator-scripts\") pod \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\" (UID: \"55f15fcf-5410-4b63-a8c1-b3e7329763ec\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.654987 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a45c64-e893-493f-803e-46674d56ec70-operator-scripts\") pod \"13a45c64-e893-493f-803e-46674d56ec70\" (UID: \"13a45c64-e893-493f-803e-46674d56ec70\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.655026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c62q\" (UniqueName: \"kubernetes.io/projected/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-kube-api-access-6c62q\") pod 
\"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\" (UID: \"c80fe934-a893-4d6e-9ca1-6df8d12dda6a\") " Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.655623 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.655751 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55f15fcf-5410-4b63-a8c1-b3e7329763ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55f15fcf-5410-4b63-a8c1-b3e7329763ec" (UID: "55f15fcf-5410-4b63-a8c1-b3e7329763ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.656177 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4465bf8f-73d3-4317-b90e-4f6eac8a59c4" (UID: "4465bf8f-73d3-4317-b90e-4f6eac8a59c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.656666 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a45c64-e893-493f-803e-46674d56ec70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "13a45c64-e893-493f-803e-46674d56ec70" (UID: "13a45c64-e893-493f-803e-46674d56ec70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.658691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-kube-api-access-5jm6c" (OuterVolumeSpecName: "kube-api-access-5jm6c") pod "4465bf8f-73d3-4317-b90e-4f6eac8a59c4" (UID: "4465bf8f-73d3-4317-b90e-4f6eac8a59c4"). InnerVolumeSpecName "kube-api-access-5jm6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.658801 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f15fcf-5410-4b63-a8c1-b3e7329763ec-kube-api-access-nl5c5" (OuterVolumeSpecName: "kube-api-access-nl5c5") pod "55f15fcf-5410-4b63-a8c1-b3e7329763ec" (UID: "55f15fcf-5410-4b63-a8c1-b3e7329763ec"). InnerVolumeSpecName "kube-api-access-nl5c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.659040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a45c64-e893-493f-803e-46674d56ec70-kube-api-access-tp8b8" (OuterVolumeSpecName: "kube-api-access-tp8b8") pod "13a45c64-e893-493f-803e-46674d56ec70" (UID: "13a45c64-e893-493f-803e-46674d56ec70"). InnerVolumeSpecName "kube-api-access-tp8b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.672236 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-kube-api-access-6c62q" (OuterVolumeSpecName: "kube-api-access-6c62q") pod "c80fe934-a893-4d6e-9ca1-6df8d12dda6a" (UID: "c80fe934-a893-4d6e-9ca1-6df8d12dda6a"). InnerVolumeSpecName "kube-api-access-6c62q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.756721 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl5c5\" (UniqueName: \"kubernetes.io/projected/55f15fcf-5410-4b63-a8c1-b3e7329763ec-kube-api-access-nl5c5\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.756764 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jm6c\" (UniqueName: \"kubernetes.io/projected/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-kube-api-access-5jm6c\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.756778 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp8b8\" (UniqueName: \"kubernetes.io/projected/13a45c64-e893-493f-803e-46674d56ec70-kube-api-access-tp8b8\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.756816 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4465bf8f-73d3-4317-b90e-4f6eac8a59c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.756829 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f15fcf-5410-4b63-a8c1-b3e7329763ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.756841 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a45c64-e893-493f-803e-46674d56ec70-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:28 crc kubenswrapper[4749]: I0128 18:57:28.756854 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c62q\" (UniqueName: \"kubernetes.io/projected/c80fe934-a893-4d6e-9ca1-6df8d12dda6a-kube-api-access-6c62q\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:29 crc kubenswrapper[4749]: I0128 18:57:29.556118 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4db7-account-create-update-d7fz2" Jan 28 18:57:29 crc kubenswrapper[4749]: I0128 18:57:29.556532 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a2a6-account-create-update-c8ps8" Jan 28 18:57:29 crc kubenswrapper[4749]: I0128 18:57:29.556510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l6q85" event={"ID":"8e9438aa-0957-4b78-8f0f-a12fb94e86b7","Type":"ContainerStarted","Data":"d849d1211d477ed41314253092f15dd21b0d6a39d765095000aedab6749f6032"} Jan 28 18:57:29 crc kubenswrapper[4749]: I0128 18:57:29.557282 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-9zdr2" Jan 28 18:57:29 crc kubenswrapper[4749]: I0128 18:57:29.611961 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-l6q85" podStartSLOduration=2.810372656 podStartE2EDuration="7.611945422s" podCreationTimestamp="2026-01-28 18:57:22 +0000 UTC" firstStartedPulling="2026-01-28 18:57:23.421794229 +0000 UTC m=+1311.433321004" lastFinishedPulling="2026-01-28 18:57:28.223366995 +0000 UTC m=+1316.234893770" observedRunningTime="2026-01-28 18:57:29.600560541 +0000 UTC m=+1317.612087326" watchObservedRunningTime="2026-01-28 18:57:29.611945422 +0000 UTC m=+1317.623472197" Jan 28 18:57:31 crc kubenswrapper[4749]: I0128 18:57:31.576359 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e9438aa-0957-4b78-8f0f-a12fb94e86b7" containerID="d849d1211d477ed41314253092f15dd21b0d6a39d765095000aedab6749f6032" exitCode=0 Jan 28 18:57:31 crc kubenswrapper[4749]: I0128 18:57:31.576455 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l6q85" event={"ID":"8e9438aa-0957-4b78-8f0f-a12fb94e86b7","Type":"ContainerDied","Data":"d849d1211d477ed41314253092f15dd21b0d6a39d765095000aedab6749f6032"} Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.017190 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.147807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgg8q\" (UniqueName: \"kubernetes.io/projected/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-kube-api-access-wgg8q\") pod \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.147930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-combined-ca-bundle\") pod \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.148117 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-config-data\") pod \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\" (UID: \"8e9438aa-0957-4b78-8f0f-a12fb94e86b7\") " Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.163094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-kube-api-access-wgg8q" (OuterVolumeSpecName: "kube-api-access-wgg8q") pod "8e9438aa-0957-4b78-8f0f-a12fb94e86b7" (UID: "8e9438aa-0957-4b78-8f0f-a12fb94e86b7"). InnerVolumeSpecName "kube-api-access-wgg8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.178072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e9438aa-0957-4b78-8f0f-a12fb94e86b7" (UID: "8e9438aa-0957-4b78-8f0f-a12fb94e86b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.200619 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-config-data" (OuterVolumeSpecName: "config-data") pod "8e9438aa-0957-4b78-8f0f-a12fb94e86b7" (UID: "8e9438aa-0957-4b78-8f0f-a12fb94e86b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.251241 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.251278 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.251289 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgg8q\" (UniqueName: \"kubernetes.io/projected/8e9438aa-0957-4b78-8f0f-a12fb94e86b7-kube-api-access-wgg8q\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.593986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l6q85" event={"ID":"8e9438aa-0957-4b78-8f0f-a12fb94e86b7","Type":"ContainerDied","Data":"2592fa7ae339e06c2a8ebf3ca79b508d84afd309536cf52388f8bf8f0d824df7"} Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.594034 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2592fa7ae339e06c2a8ebf3ca79b508d84afd309536cf52388f8bf8f0d824df7" Jan 28 18:57:33 crc kubenswrapper[4749]: I0128 18:57:33.594045 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-l6q85" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.300469 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j7zb9"] Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301282 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80fe934-a893-4d6e-9ca1-6df8d12dda6a" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301301 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80fe934-a893-4d6e-9ca1-6df8d12dda6a" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301408 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4465bf8f-73d3-4317-b90e-4f6eac8a59c4" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301418 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4465bf8f-73d3-4317-b90e-4f6eac8a59c4" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301446 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0042f276-29d6-4d7c-938d-4ab73a8162a5" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301451 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0042f276-29d6-4d7c-938d-4ab73a8162a5" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301466 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9438aa-0957-4b78-8f0f-a12fb94e86b7" containerName="keystone-db-sync" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301475 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9438aa-0957-4b78-8f0f-a12fb94e86b7" containerName="keystone-db-sync" Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301489 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f15fcf-5410-4b63-a8c1-b3e7329763ec" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301497 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f15fcf-5410-4b63-a8c1-b3e7329763ec" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301506 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a45c64-e893-493f-803e-46674d56ec70" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301512 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a45c64-e893-493f-803e-46674d56ec70" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301525 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b87adb-b9c0-464c-93e8-e7b8528b8b51" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301533 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b87adb-b9c0-464c-93e8-e7b8528b8b51" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301548 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93832f13-c8d6-4b51-a99a-0a4a3383d7cc" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301554 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="93832f13-c8d6-4b51-a99a-0a4a3383d7cc" 
containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: E0128 18:57:34.301568 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40db02b-0440-4ca9-a0a7-86e18dede584" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301573 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40db02b-0440-4ca9-a0a7-86e18dede584" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301778 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f15fcf-5410-4b63-a8c1-b3e7329763ec" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301792 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40db02b-0440-4ca9-a0a7-86e18dede584" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301802 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4465bf8f-73d3-4317-b90e-4f6eac8a59c4" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301810 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a45c64-e893-493f-803e-46674d56ec70" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301826 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c80fe934-a893-4d6e-9ca1-6df8d12dda6a" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301836 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b87adb-b9c0-464c-93e8-e7b8528b8b51" containerName="mariadb-database-create" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301845 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9438aa-0957-4b78-8f0f-a12fb94e86b7" containerName="keystone-db-sync" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301861 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0042f276-29d6-4d7c-938d-4ab73a8162a5" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.301877 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="93832f13-c8d6-4b51-a99a-0a4a3383d7cc" containerName="mariadb-account-create-update" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.305001 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.331668 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j7zb9"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.433493 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zqw4k"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.446302 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.458373 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.458715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.458859 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.458973 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.463367 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wl54b" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.504275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87bbm\" (UniqueName: \"kubernetes.io/projected/cd49fada-001f-41b1-8a55-bf636f345eed-kube-api-access-87bbm\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.504397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.504471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.504498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.504515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-config\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.504537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.521387 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zqw4k"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.576383 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-6mtp7"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.577708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.579967 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-jpk7v" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.584395 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.589388 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-6mtp7"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.624863 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.624976 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-config-data\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.625026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.625266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-config\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h2dg\" (UniqueName: \"kubernetes.io/projected/fb710c26-5706-4ed4-a08d-641315121c9e-kube-api-access-8h2dg\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lzc2\" (UniqueName: \"kubernetes.io/projected/21d60617-f5d4-461a-821c-6e59612876ae-kube-api-access-7lzc2\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626171 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-config-data\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626371 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-fernet-keys\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-credential-keys\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-combined-ca-bundle\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87bbm\" (UniqueName: \"kubernetes.io/projected/cd49fada-001f-41b1-8a55-bf636f345eed-kube-api-access-87bbm\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.626787 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-combined-ca-bundle\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.629152 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-69wsn"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.631562 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.634236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.635541 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.636138 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.638876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-config\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.639072 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.639268 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.640020 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sr5z4" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.640361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-scripts\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.640502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.641302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.649837 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-69wsn"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.659772 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dhqq7"] Jan 28 
18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.661555 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.672197 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dhqq7"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.682037 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.683525 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nhvmb" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.683743 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.694899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87bbm\" (UniqueName: \"kubernetes.io/projected/cd49fada-001f-41b1-8a55-bf636f345eed-kube-api-access-87bbm\") pod \"dnsmasq-dns-847c4cc679-j7zb9\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.727452 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j7zb9"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.728315 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.741846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-scripts\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.742150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-combined-ca-bundle\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.742176 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-config\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.742198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-config-data\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.742217 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-combined-ca-bundle\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 
18:57:34.742242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-combined-ca-bundle\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.742264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9b37dae-71dc-4af8-9229-4f7124bcbb16-etc-machine-id\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.742282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-combined-ca-bundle\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.742313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-scripts\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743542 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-config-data\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h2dg\" (UniqueName: \"kubernetes.io/projected/fb710c26-5706-4ed4-a08d-641315121c9e-kube-api-access-8h2dg\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lzc2\" (UniqueName: \"kubernetes.io/projected/21d60617-f5d4-461a-821c-6e59612876ae-kube-api-access-7lzc2\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743648 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-db-sync-config-data\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-config-data\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zzrgk\" (UniqueName: \"kubernetes.io/projected/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-kube-api-access-zzrgk\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-fernet-keys\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743797 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcfp9\" (UniqueName: \"kubernetes.io/projected/e9b37dae-71dc-4af8-9229-4f7124bcbb16-kube-api-access-dcfp9\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.743837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-credential-keys\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.753166 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-credential-keys\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.762742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-config-data\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.767589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-fernet-keys\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.768232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-combined-ca-bundle\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.779945 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-combined-ca-bundle\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.780671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-config-data\") pod \"keystone-bootstrap-zqw4k\" (UID: 
\"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.783445 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-z8gws"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.785041 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.786582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-scripts\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.800237 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pjq9l" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.800568 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.800718 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.800790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lzc2\" (UniqueName: \"kubernetes.io/projected/21d60617-f5d4-461a-821c-6e59612876ae-kube-api-access-7lzc2\") pod \"keystone-bootstrap-zqw4k\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.809068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h2dg\" (UniqueName: \"kubernetes.io/projected/fb710c26-5706-4ed4-a08d-641315121c9e-kube-api-access-8h2dg\") pod \"heat-db-sync-6mtp7\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.811713 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-j6wkc"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.812413 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.817602 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.827969 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-kvhbz" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.828281 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.838690 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-z8gws"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-scripts\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-combined-ca-bundle\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-config\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-config-data\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-combined-ca-bundle\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9b37dae-71dc-4af8-9229-4f7124bcbb16-etc-machine-id\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmw2r\" (UniqueName: \"kubernetes.io/projected/e22b0cba-1eee-4244-abe1-20e7694a9813-kube-api-access-qmw2r\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-scripts\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " 
pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847587 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-combined-ca-bundle\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-db-sync-config-data\") pod \"barbican-db-sync-j6wkc\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22b0cba-1eee-4244-abe1-20e7694a9813-logs\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-config-data\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847696 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvt8f\" (UniqueName: \"kubernetes.io/projected/9425c299-238f-4293-920f-6ae7ab0c6bb1-kube-api-access-jvt8f\") pod \"barbican-db-sync-j6wkc\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-combined-ca-bundle\") pod \"barbican-db-sync-j6wkc\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847746 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-db-sync-config-data\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzrgk\" (UniqueName: \"kubernetes.io/projected/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-kube-api-access-zzrgk\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.847824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcfp9\" (UniqueName: \"kubernetes.io/projected/e9b37dae-71dc-4af8-9229-4f7124bcbb16-kube-api-access-dcfp9\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " 
pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.849765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9b37dae-71dc-4af8-9229-4f7124bcbb16-etc-machine-id\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.857038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-config\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.866262 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-combined-ca-bundle\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.866495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-combined-ca-bundle\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.873857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-db-sync-config-data\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.889902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-scripts\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.900386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-config-data\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.914809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcfp9\" (UniqueName: \"kubernetes.io/projected/e9b37dae-71dc-4af8-9229-4f7124bcbb16-kube-api-access-dcfp9\") pod \"cinder-db-sync-dhqq7\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.916797 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-6mtp7" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.922994 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzrgk\" (UniqueName: \"kubernetes.io/projected/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-kube-api-access-zzrgk\") pod \"neutron-db-sync-69wsn\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.936651 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fnmw9"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.949276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmw2r\" (UniqueName: \"kubernetes.io/projected/e22b0cba-1eee-4244-abe1-20e7694a9813-kube-api-access-qmw2r\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.949354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-scripts\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.949377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-combined-ca-bundle\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.949422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-db-sync-config-data\") pod \"barbican-db-sync-j6wkc\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.949453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22b0cba-1eee-4244-abe1-20e7694a9813-logs\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.949473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-config-data\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.949494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvt8f\" (UniqueName: \"kubernetes.io/projected/9425c299-238f-4293-920f-6ae7ab0c6bb1-kube-api-access-jvt8f\") pod \"barbican-db-sync-j6wkc\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.949519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-combined-ca-bundle\") pod \"barbican-db-sync-j6wkc\" (UID: 
\"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.950874 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22b0cba-1eee-4244-abe1-20e7694a9813-logs\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.957733 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-j6wkc"] Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.957894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-combined-ca-bundle\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.957926 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.958848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-scripts\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.959664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-combined-ca-bundle\") pod \"barbican-db-sync-j6wkc\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.967962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-db-sync-config-data\") pod \"barbican-db-sync-j6wkc\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.976086 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-config-data\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.977012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvt8f\" (UniqueName: \"kubernetes.io/projected/9425c299-238f-4293-920f-6ae7ab0c6bb1-kube-api-access-jvt8f\") pod \"barbican-db-sync-j6wkc\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.980131 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmw2r\" (UniqueName: \"kubernetes.io/projected/e22b0cba-1eee-4244-abe1-20e7694a9813-kube-api-access-qmw2r\") pod \"placement-db-sync-z8gws\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.984672 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fnmw9"] Jan 
28 18:57:34 crc kubenswrapper[4749]: I0128 18:57:34.986930 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-69wsn" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.055692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.055756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.056190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9frd\" (UniqueName: \"kubernetes.io/projected/275bf36a-db3c-45a5-8eee-fbb493146648-kube-api-access-z9frd\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.056283 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.056344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-config\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.056441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.061523 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.064109 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.068160 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.066265 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.089289 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.117428 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-log-httpd\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-config-data\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163592 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njttk\" (UniqueName: \"kubernetes.io/projected/9177835f-67ce-45ec-9c8c-843e54b25deb-kube-api-access-njttk\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-run-httpd\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-scripts\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163724 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9frd\" (UniqueName: \"kubernetes.io/projected/275bf36a-db3c-45a5-8eee-fbb493146648-kube-api-access-z9frd\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-config\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.163850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.164073 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.164134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.164311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.164383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.169835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.170431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-config\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.170970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.171277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.173307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.195284 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-z8gws" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.204627 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.257836 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9frd\" (UniqueName: \"kubernetes.io/projected/275bf36a-db3c-45a5-8eee-fbb493146648-kube-api-access-z9frd\") pod \"dnsmasq-dns-785d8bcb8c-fnmw9\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.273796 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.278180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-log-httpd\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.278232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njttk\" (UniqueName: \"kubernetes.io/projected/9177835f-67ce-45ec-9c8c-843e54b25deb-kube-api-access-njttk\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.278255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-config-data\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.278281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-run-httpd\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.285035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-log-httpd\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.285336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-run-httpd\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.289239 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-scripts\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.289867 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.289909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.295006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.315659 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-config-data\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.315829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.323459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njttk\" (UniqueName: \"kubernetes.io/projected/9177835f-67ce-45ec-9c8c-843e54b25deb-kube-api-access-njttk\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.342956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-scripts\") pod \"ceilometer-0\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.558078 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.561026 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.564800 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b8x9h" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.573866 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.574092 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.574238 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.609756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-logs\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.609822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.609889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-scripts\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.609920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-config-data\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.610070 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzj6v\" (UniqueName: \"kubernetes.io/projected/154ed954-b63f-438c-87c2-8213ce7fb833-kube-api-access-hzj6v\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.610100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.610130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.610289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.611749 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.618375 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.700957 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.712612 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.716533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.716601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-logs\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.716644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.716715 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-scripts\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.716758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-config-data\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.716848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzj6v\" (UniqueName: \"kubernetes.io/projected/154ed954-b63f-438c-87c2-8213ce7fb833-kube-api-access-hzj6v\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 
18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.716894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.716929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.717615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.720642 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.727616 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-logs\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.736811 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.739522 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.739550 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/90c2183eeb61266815926dade9237ac0e2270af237c5c71fcfec8d480d6530dc/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.756202 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.759141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-config-data\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.762801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.830395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-scripts\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.844940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzj6v\" (UniqueName: \"kubernetes.io/projected/154ed954-b63f-438c-87c2-8213ce7fb833-kube-api-access-hzj6v\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.923367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.923428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.923492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m9kd2\" (UniqueName: \"kubernetes.io/projected/bb2ece96-d40d-4005-94c7-548d4b8baa30-kube-api-access-m9kd2\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.923524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.923568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.923585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.923613 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.923662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:35 crc kubenswrapper[4749]: I0128 18:57:35.953945 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.026236 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j7zb9"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.038955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.039054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.039235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-m9kd2\" (UniqueName: \"kubernetes.io/projected/bb2ece96-d40d-4005-94c7-548d4b8baa30-kube-api-access-m9kd2\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.039322 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.039468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.039498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.039553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.039711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.040992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.042462 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.051575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.053426 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.053472 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/71cf6cfdb445cd2c33acb8ae4dd1b724414e52d8c80d6fd9f949055d3c3fc984/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.060610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.060648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.064926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.070569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.084648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9kd2\" (UniqueName: \"kubernetes.io/projected/bb2ece96-d40d-4005-94c7-548d4b8baa30-kube-api-access-m9kd2\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.097485 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-69wsn"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.156369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.216640 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.278104 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-6mtp7"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.299791 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zqw4k"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.329213 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.616067 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fnmw9"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.635251 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dhqq7"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.718545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-z8gws"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.726663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-69wsn" event={"ID":"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d","Type":"ContainerStarted","Data":"5cd99b761b7c639919edb05e077be3b019720251d15ea49760004e79f83885e2"} Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.746500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-69wsn" event={"ID":"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d","Type":"ContainerStarted","Data":"757a995c78231792b42e8d071b94e5e07e7807bc6629125adf5d8f198ffbe511"} Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.746548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zqw4k" event={"ID":"21d60617-f5d4-461a-821c-6e59612876ae","Type":"ContainerStarted","Data":"83f489209ce842339d56ae7a33c76fa63346bd2ac10f3d69ea7678f351754154"} Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.746559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zqw4k" event={"ID":"21d60617-f5d4-461a-821c-6e59612876ae","Type":"ContainerStarted","Data":"0f43649f7f1753fa4262bfbd4c580702fecb6285e68a02efb4420ff04675abaa"} Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.749490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" event={"ID":"275bf36a-db3c-45a5-8eee-fbb493146648","Type":"ContainerStarted","Data":"f63dc23351315daacdfef92399d4b1700869ed9a64f1bcc1b8a66d78e46afd7f"} Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.769380 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd49fada-001f-41b1-8a55-bf636f345eed" containerID="6d1fe440f7f7271afa502a7577f4b89e96aceaaeb7eae07b6f75880ac4b9d299" exitCode=0 Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.769510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" event={"ID":"cd49fada-001f-41b1-8a55-bf636f345eed","Type":"ContainerDied","Data":"6d1fe440f7f7271afa502a7577f4b89e96aceaaeb7eae07b6f75880ac4b9d299"} Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.769542 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" event={"ID":"cd49fada-001f-41b1-8a55-bf636f345eed","Type":"ContainerStarted","Data":"316d6b38fdb0906d9e0d1feb987ad26396bc376a7dfb35a9e6be1428f1c1a5a6"} Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.785804 4749 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-j6wkc"] Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.786405 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-69wsn" podStartSLOduration=2.786393587 podStartE2EDuration="2.786393587s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:36.779533677 +0000 UTC m=+1324.791060452" watchObservedRunningTime="2026-01-28 18:57:36.786393587 +0000 UTC m=+1324.797920372" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.805823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6mtp7" event={"ID":"fb710c26-5706-4ed4-a08d-641315121c9e","Type":"ContainerStarted","Data":"5ba6c01f6bb7e0cfb282fc2f8f7401a1daf61e2205526da3c670ff09159592f6"} Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.813806 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zqw4k" podStartSLOduration=2.813786335 podStartE2EDuration="2.813786335s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:36.806136065 +0000 UTC m=+1324.817662840" watchObservedRunningTime="2026-01-28 18:57:36.813786335 +0000 UTC m=+1324.825313110" Jan 28 18:57:36 crc kubenswrapper[4749]: I0128 18:57:36.844750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dhqq7" event={"ID":"e9b37dae-71dc-4af8-9229-4f7124bcbb16","Type":"ContainerStarted","Data":"d82082391df70777c1598902538b39b6b53cfb42d711e00645e992e40236f66c"} Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.009699 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:57:37 crc kubenswrapper[4749]: W0128 18:57:37.151653 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod154ed954_b63f_438c_87c2_8213ce7fb833.slice/crio-554348c16e63a40dd85d346ce71303438531424b2ff0fde11706cb6eb3fa2531 WatchSource:0}: Error finding container 554348c16e63a40dd85d346ce71303438531424b2ff0fde11706cb6eb3fa2531: Status 404 returned error can't find the container with id 554348c16e63a40dd85d346ce71303438531424b2ff0fde11706cb6eb3fa2531 Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.170270 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.300947 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.425909 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.459573 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:57:37 crc kubenswrapper[4749]: W0128 18:57:37.472244 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb2ece96_d40d_4005_94c7_548d4b8baa30.slice/crio-bb08716b6f416f487b38319c1fda4831452e8f62512071c1704d0b1a575c5ef7 WatchSource:0}: Error finding container 
bb08716b6f416f487b38319c1fda4831452e8f62512071c1704d0b1a575c5ef7: Status 404 returned error can't find the container with id bb08716b6f416f487b38319c1fda4831452e8f62512071c1704d0b1a575c5ef7 Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.563603 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.667298 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.685222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-svc\") pod \"cd49fada-001f-41b1-8a55-bf636f345eed\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.685665 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-sb\") pod \"cd49fada-001f-41b1-8a55-bf636f345eed\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.685887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-nb\") pod \"cd49fada-001f-41b1-8a55-bf636f345eed\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.685952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-swift-storage-0\") pod \"cd49fada-001f-41b1-8a55-bf636f345eed\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.685979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87bbm\" (UniqueName: \"kubernetes.io/projected/cd49fada-001f-41b1-8a55-bf636f345eed-kube-api-access-87bbm\") pod \"cd49fada-001f-41b1-8a55-bf636f345eed\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.686052 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-config\") pod \"cd49fada-001f-41b1-8a55-bf636f345eed\" (UID: \"cd49fada-001f-41b1-8a55-bf636f345eed\") " Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.703636 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd49fada-001f-41b1-8a55-bf636f345eed-kube-api-access-87bbm" (OuterVolumeSpecName: "kube-api-access-87bbm") pod "cd49fada-001f-41b1-8a55-bf636f345eed" (UID: "cd49fada-001f-41b1-8a55-bf636f345eed"). InnerVolumeSpecName "kube-api-access-87bbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.745511 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cd49fada-001f-41b1-8a55-bf636f345eed" (UID: "cd49fada-001f-41b1-8a55-bf636f345eed"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.750959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd49fada-001f-41b1-8a55-bf636f345eed" (UID: "cd49fada-001f-41b1-8a55-bf636f345eed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.770653 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-config" (OuterVolumeSpecName: "config") pod "cd49fada-001f-41b1-8a55-bf636f345eed" (UID: "cd49fada-001f-41b1-8a55-bf636f345eed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.783449 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd49fada-001f-41b1-8a55-bf636f345eed" (UID: "cd49fada-001f-41b1-8a55-bf636f345eed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.791026 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd49fada-001f-41b1-8a55-bf636f345eed" (UID: "cd49fada-001f-41b1-8a55-bf636f345eed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.792714 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.792748 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.792757 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.792768 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87bbm\" (UniqueName: \"kubernetes.io/projected/cd49fada-001f-41b1-8a55-bf636f345eed-kube-api-access-87bbm\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.792781 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.792789 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd49fada-001f-41b1-8a55-bf636f345eed-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.890512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-j6wkc" 
event={"ID":"9425c299-238f-4293-920f-6ae7ab0c6bb1","Type":"ContainerStarted","Data":"20fc7eadc24c98be396ff5220e8546faa8e4234d3243c44814528ef295b613c4"} Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.893821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" event={"ID":"cd49fada-001f-41b1-8a55-bf636f345eed","Type":"ContainerDied","Data":"316d6b38fdb0906d9e0d1feb987ad26396bc376a7dfb35a9e6be1428f1c1a5a6"} Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.893877 4749 scope.go:117] "RemoveContainer" containerID="6d1fe440f7f7271afa502a7577f4b89e96aceaaeb7eae07b6f75880ac4b9d299" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.894024 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j7zb9" Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.901785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z8gws" event={"ID":"e22b0cba-1eee-4244-abe1-20e7694a9813","Type":"ContainerStarted","Data":"3b95c0a5a31572d396a2ce36a0add5c4b41fea8b7faf0cb5096bfc5406a5979a"} Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.906940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb2ece96-d40d-4005-94c7-548d4b8baa30","Type":"ContainerStarted","Data":"bb08716b6f416f487b38319c1fda4831452e8f62512071c1704d0b1a575c5ef7"} Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.911441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9177835f-67ce-45ec-9c8c-843e54b25deb","Type":"ContainerStarted","Data":"af7272dfb1b09d8cb4d03e6fb1487b3c3c9ea12beb53e859fa93ce2086bf7c0c"} Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.914188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154ed954-b63f-438c-87c2-8213ce7fb833","Type":"ContainerStarted","Data":"554348c16e63a40dd85d346ce71303438531424b2ff0fde11706cb6eb3fa2531"} Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.916859 4749 generic.go:334] "Generic (PLEG): container finished" podID="275bf36a-db3c-45a5-8eee-fbb493146648" containerID="ddd6f4363a05b4af47b00166b29a885fb4ee2950240a9778a2947eb06a55c917" exitCode=0 Jan 28 18:57:37 crc kubenswrapper[4749]: I0128 18:57:37.918821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" event={"ID":"275bf36a-db3c-45a5-8eee-fbb493146648","Type":"ContainerDied","Data":"ddd6f4363a05b4af47b00166b29a885fb4ee2950240a9778a2947eb06a55c917"} Jan 28 18:57:38 crc kubenswrapper[4749]: I0128 18:57:38.184454 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j7zb9"] Jan 28 18:57:38 crc kubenswrapper[4749]: I0128 18:57:38.187343 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j7zb9"] Jan 28 18:57:38 crc kubenswrapper[4749]: I0128 18:57:38.887384 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd49fada-001f-41b1-8a55-bf636f345eed" path="/var/lib/kubelet/pods/cd49fada-001f-41b1-8a55-bf636f345eed/volumes" Jan 28 18:57:38 crc kubenswrapper[4749]: I0128 18:57:38.933927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" event={"ID":"275bf36a-db3c-45a5-8eee-fbb493146648","Type":"ContainerStarted","Data":"a6b191bb88c854644fc560781d94799bb7158fb4409855c799183972da70a81e"} Jan 28 18:57:38 crc 
kubenswrapper[4749]: I0128 18:57:38.942400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154ed954-b63f-438c-87c2-8213ce7fb833","Type":"ContainerStarted","Data":"527187721aaa442981658c2e7f17bdfb4ec3996975013b0bf4619d4f94f7c4ec"} Jan 28 18:57:40 crc kubenswrapper[4749]: I0128 18:57:39.975942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb2ece96-d40d-4005-94c7-548d4b8baa30","Type":"ContainerStarted","Data":"55842b8585f907e511d5b26f14f0bb879adf5873f05ab0915623b3ec876b463c"} Jan 28 18:57:40 crc kubenswrapper[4749]: I0128 18:57:39.976584 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:40 crc kubenswrapper[4749]: I0128 18:57:40.017057 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" podStartSLOduration=6.017029883 podStartE2EDuration="6.017029883s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:39.999422978 +0000 UTC m=+1328.010949773" watchObservedRunningTime="2026-01-28 18:57:40.017029883 +0000 UTC m=+1328.028556658" Jan 28 18:57:40 crc kubenswrapper[4749]: I0128 18:57:40.993043 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerName="glance-log" containerID="cri-o://55842b8585f907e511d5b26f14f0bb879adf5873f05ab0915623b3ec876b463c" gracePeriod=30 Jan 28 18:57:40 crc kubenswrapper[4749]: I0128 18:57:40.993387 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerName="glance-httpd" containerID="cri-o://7647b4f38cebb779e22deaf625787595407904593a5a396c5390b3164e8a7b0c" gracePeriod=30 Jan 28 18:57:40 crc kubenswrapper[4749]: I0128 18:57:40.993404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb2ece96-d40d-4005-94c7-548d4b8baa30","Type":"ContainerStarted","Data":"7647b4f38cebb779e22deaf625787595407904593a5a396c5390b3164e8a7b0c"} Jan 28 18:57:41 crc kubenswrapper[4749]: I0128 18:57:41.000222 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154ed954-b63f-438c-87c2-8213ce7fb833","Type":"ContainerStarted","Data":"fe5fa8a4d930fa586c52dad348820c19bc0d119c9d8842b1c8b01cba7640cedd"} Jan 28 18:57:41 crc kubenswrapper[4749]: I0128 18:57:41.000354 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" containerName="glance-log" containerID="cri-o://527187721aaa442981658c2e7f17bdfb4ec3996975013b0bf4619d4f94f7c4ec" gracePeriod=30 Jan 28 18:57:41 crc kubenswrapper[4749]: I0128 18:57:41.000441 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" containerName="glance-httpd" containerID="cri-o://fe5fa8a4d930fa586c52dad348820c19bc0d119c9d8842b1c8b01cba7640cedd" gracePeriod=30 Jan 28 18:57:41 crc kubenswrapper[4749]: I0128 18:57:41.024710 4749 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.024688263 podStartE2EDuration="7.024688263s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:41.01327816 +0000 UTC m=+1329.024804955" watchObservedRunningTime="2026-01-28 18:57:41.024688263 +0000 UTC m=+1329.036215038" Jan 28 18:57:41 crc kubenswrapper[4749]: I0128 18:57:41.051096 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.051078076 podStartE2EDuration="7.051078076s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:57:41.038974756 +0000 UTC m=+1329.050501561" watchObservedRunningTime="2026-01-28 18:57:41.051078076 +0000 UTC m=+1329.062604851" Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.015020 4749 generic.go:334] "Generic (PLEG): container finished" podID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerID="7647b4f38cebb779e22deaf625787595407904593a5a396c5390b3164e8a7b0c" exitCode=0 Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.015307 4749 generic.go:334] "Generic (PLEG): container finished" podID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerID="55842b8585f907e511d5b26f14f0bb879adf5873f05ab0915623b3ec876b463c" exitCode=143 Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.015102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb2ece96-d40d-4005-94c7-548d4b8baa30","Type":"ContainerDied","Data":"7647b4f38cebb779e22deaf625787595407904593a5a396c5390b3164e8a7b0c"} Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.015366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb2ece96-d40d-4005-94c7-548d4b8baa30","Type":"ContainerDied","Data":"55842b8585f907e511d5b26f14f0bb879adf5873f05ab0915623b3ec876b463c"} Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.017623 4749 generic.go:334] "Generic (PLEG): container finished" podID="154ed954-b63f-438c-87c2-8213ce7fb833" containerID="fe5fa8a4d930fa586c52dad348820c19bc0d119c9d8842b1c8b01cba7640cedd" exitCode=0 Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.017645 4749 generic.go:334] "Generic (PLEG): container finished" podID="154ed954-b63f-438c-87c2-8213ce7fb833" containerID="527187721aaa442981658c2e7f17bdfb4ec3996975013b0bf4619d4f94f7c4ec" exitCode=143 Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.017666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154ed954-b63f-438c-87c2-8213ce7fb833","Type":"ContainerDied","Data":"fe5fa8a4d930fa586c52dad348820c19bc0d119c9d8842b1c8b01cba7640cedd"} Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.017686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154ed954-b63f-438c-87c2-8213ce7fb833","Type":"ContainerDied","Data":"527187721aaa442981658c2e7f17bdfb4ec3996975013b0bf4619d4f94f7c4ec"} Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.968305 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:57:42 crc kubenswrapper[4749]: I0128 18:57:42.979456 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.039238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzj6v\" (UniqueName: \"kubernetes.io/projected/154ed954-b63f-438c-87c2-8213ce7fb833-kube-api-access-hzj6v\") pod \"154ed954-b63f-438c-87c2-8213ce7fb833\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.039706 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-internal-tls-certs\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.039851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-config-data\") pod \"154ed954-b63f-438c-87c2-8213ce7fb833\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.039901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-logs\") pod \"154ed954-b63f-438c-87c2-8213ce7fb833\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.039930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-public-tls-certs\") pod \"154ed954-b63f-438c-87c2-8213ce7fb833\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.039966 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-httpd-run\") pod \"154ed954-b63f-438c-87c2-8213ce7fb833\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.039997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-scripts\") pod \"154ed954-b63f-438c-87c2-8213ce7fb833\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040016 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9kd2\" (UniqueName: \"kubernetes.io/projected/bb2ece96-d40d-4005-94c7-548d4b8baa30-kube-api-access-m9kd2\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040075 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-scripts\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040245 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"154ed954-b63f-438c-87c2-8213ce7fb833\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040286 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-config-data\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040363 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-combined-ca-bundle\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-logs\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-httpd-run\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040421 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-combined-ca-bundle\") pod \"154ed954-b63f-438c-87c2-8213ce7fb833\" (UID: \"154ed954-b63f-438c-87c2-8213ce7fb833\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.040524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.044295 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-logs" (OuterVolumeSpecName: "logs") pod "154ed954-b63f-438c-87c2-8213ce7fb833" (UID: "154ed954-b63f-438c-87c2-8213ce7fb833"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.044318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "154ed954-b63f-438c-87c2-8213ce7fb833" (UID: "154ed954-b63f-438c-87c2-8213ce7fb833"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.046068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb2ece96-d40d-4005-94c7-548d4b8baa30","Type":"ContainerDied","Data":"bb08716b6f416f487b38319c1fda4831452e8f62512071c1704d0b1a575c5ef7"} Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.048541 4749 scope.go:117] "RemoveContainer" containerID="7647b4f38cebb779e22deaf625787595407904593a5a396c5390b3164e8a7b0c" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.048908 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.055701 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-logs" (OuterVolumeSpecName: "logs") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.055989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.056191 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2ece96-d40d-4005-94c7-548d4b8baa30-kube-api-access-m9kd2" (OuterVolumeSpecName: "kube-api-access-m9kd2") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30"). InnerVolumeSpecName "kube-api-access-m9kd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.056196 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154ed954-b63f-438c-87c2-8213ce7fb833-kube-api-access-hzj6v" (OuterVolumeSpecName: "kube-api-access-hzj6v") pod "154ed954-b63f-438c-87c2-8213ce7fb833" (UID: "154ed954-b63f-438c-87c2-8213ce7fb833"). InnerVolumeSpecName "kube-api-access-hzj6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.058334 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154ed954-b63f-438c-87c2-8213ce7fb833","Type":"ContainerDied","Data":"554348c16e63a40dd85d346ce71303438531424b2ff0fde11706cb6eb3fa2531"} Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.058575 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.062230 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-scripts" (OuterVolumeSpecName: "scripts") pod "154ed954-b63f-438c-87c2-8213ce7fb833" (UID: "154ed954-b63f-438c-87c2-8213ce7fb833"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.063552 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-scripts" (OuterVolumeSpecName: "scripts") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.072750 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be" (OuterVolumeSpecName: "glance") pod "154ed954-b63f-438c-87c2-8213ce7fb833" (UID: "154ed954-b63f-438c-87c2-8213ce7fb833"). InnerVolumeSpecName "pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: E0128 18:57:43.083814 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63 podName:bb2ece96-d40d-4005-94c7-548d4b8baa30 nodeName:}" failed. No retries permitted until 2026-01-28 18:57:43.583788615 +0000 UTC m=+1331.595315400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.090262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.106519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "154ed954-b63f-438c-87c2-8213ce7fb833" (UID: "154ed954-b63f-438c-87c2-8213ce7fb833"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.118615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.122066 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "154ed954-b63f-438c-87c2-8213ce7fb833" (UID: "154ed954-b63f-438c-87c2-8213ce7fb833"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.123970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-config-data" (OuterVolumeSpecName: "config-data") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.143277 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-logs\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.143544 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.143604 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154ed954-b63f-438c-87c2-8213ce7fb833-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.143997 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144057 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9kd2\" (UniqueName: \"kubernetes.io/projected/bb2ece96-d40d-4005-94c7-548d4b8baa30-kube-api-access-m9kd2\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144138 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144219 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") on node \"crc\" " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144303 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144384 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-logs\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144443 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb2ece96-d40d-4005-94c7-548d4b8baa30-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144498 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144579 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144662 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzj6v\" (UniqueName: \"kubernetes.io/projected/154ed954-b63f-438c-87c2-8213ce7fb833-kube-api-access-hzj6v\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.144725 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2ece96-d40d-4005-94c7-548d4b8baa30-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.152417 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-config-data" (OuterVolumeSpecName: "config-data") pod "154ed954-b63f-438c-87c2-8213ce7fb833" (UID: "154ed954-b63f-438c-87c2-8213ce7fb833"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.177257 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.177427 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be") on node "crc" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.247282 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154ed954-b63f-438c-87c2-8213ce7fb833-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.247318 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.412371 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.426621 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.445420 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:57:43 crc kubenswrapper[4749]: E0128 18:57:43.445950 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd49fada-001f-41b1-8a55-bf636f345eed" containerName="init" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.445972 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd49fada-001f-41b1-8a55-bf636f345eed" containerName="init" Jan 28 18:57:43 crc kubenswrapper[4749]: E0128 18:57:43.446052 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerName="glance-log" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446060 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerName="glance-log" Jan 28 18:57:43 crc kubenswrapper[4749]: E0128 18:57:43.446075 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerName="glance-httpd" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446083 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerName="glance-httpd" Jan 28 18:57:43 crc kubenswrapper[4749]: E0128 18:57:43.446101 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" containerName="glance-log" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446107 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" containerName="glance-log" Jan 28 18:57:43 crc kubenswrapper[4749]: E0128 18:57:43.446116 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" containerName="glance-httpd" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446122 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" containerName="glance-httpd" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446322 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" containerName="glance-httpd" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446399 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd49fada-001f-41b1-8a55-bf636f345eed" containerName="init" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446422 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" containerName="glance-log" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446433 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerName="glance-httpd" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.446448 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" containerName="glance-log" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.450409 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.452739 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.453215 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.460693 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.553782 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.554106 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz5zn\" (UniqueName: \"kubernetes.io/projected/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-kube-api-access-bz5zn\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.554256 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.554480 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-logs\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.554572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.554688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.554774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.555017 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.657295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"bb2ece96-d40d-4005-94c7-548d4b8baa30\" (UID: \"bb2ece96-d40d-4005-94c7-548d4b8baa30\") " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.658427 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.658709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.659653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz5zn\" (UniqueName: \"kubernetes.io/projected/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-kube-api-access-bz5zn\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.659847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.659936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-logs\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.660042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.660150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.660235 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.660877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.661192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-logs\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.662674 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.663136 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/90c2183eeb61266815926dade9237ac0e2270af237c5c71fcfec8d480d6530dc/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.667009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.667012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.667218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.670102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.679282 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63" (OuterVolumeSpecName: "glance") pod "bb2ece96-d40d-4005-94c7-548d4b8baa30" (UID: "bb2ece96-d40d-4005-94c7-548d4b8baa30"). InnerVolumeSpecName "pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.689497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz5zn\" (UniqueName: \"kubernetes.io/projected/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-kube-api-access-bz5zn\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.717347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.764948 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") on node \"crc\" " Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.774938 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.801140 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.801296 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63") on node "crc" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.865608 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.875486 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") on node \"crc\" DevicePath \"\"" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.889158 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.908016 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.909871 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.911586 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.914044 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 28 18:57:43 crc kubenswrapper[4749]: I0128 18:57:43.944674 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.080230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.080319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.080386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.080534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcrjm\" (UniqueName: \"kubernetes.io/projected/6c312272-e82e-4cdc-8501-5b27b63a3ba0-kube-api-access-gcrjm\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.080586 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.080671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.080718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.080765 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.182756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcrjm\" (UniqueName: \"kubernetes.io/projected/6c312272-e82e-4cdc-8501-5b27b63a3ba0-kube-api-access-gcrjm\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.182837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.182896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.182943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.182983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.183044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.183094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.183132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.183480 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.183740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.185299 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.185442 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/71cf6cfdb445cd2c33acb8ae4dd1b724414e52d8c80d6fd9f949055d3c3fc984/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.187456 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.188157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.188988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.199058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcrjm\" (UniqueName: \"kubernetes.io/projected/6c312272-e82e-4cdc-8501-5b27b63a3ba0-kube-api-access-gcrjm\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.199131 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.234417 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.537534 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.892714 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154ed954-b63f-438c-87c2-8213ce7fb833" path="/var/lib/kubelet/pods/154ed954-b63f-438c-87c2-8213ce7fb833/volumes" Jan 28 18:57:44 crc kubenswrapper[4749]: I0128 18:57:44.893774 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2ece96-d40d-4005-94c7-548d4b8baa30" path="/var/lib/kubelet/pods/bb2ece96-d40d-4005-94c7-548d4b8baa30/volumes" Jan 28 18:57:45 crc kubenswrapper[4749]: I0128 18:57:45.276802 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:57:45 crc kubenswrapper[4749]: I0128 18:57:45.371482 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7qlgx"] Jan 28 18:57:45 crc kubenswrapper[4749]: I0128 18:57:45.371800 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" containerID="cri-o://f0d35a0a68776b9816bde9a279d5e05dd2f57a512425c663cb8942eab26967fb" gracePeriod=10 Jan 28 18:57:46 crc kubenswrapper[4749]: I0128 18:57:46.954436 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Jan 28 18:57:49 crc kubenswrapper[4749]: I0128 18:57:49.126788 4749 generic.go:334] "Generic (PLEG): container finished" podID="32eefe58-baa1-451c-9902-545a2b6e472b" containerID="f0d35a0a68776b9816bde9a279d5e05dd2f57a512425c663cb8942eab26967fb" exitCode=0 Jan 28 18:57:49 crc kubenswrapper[4749]: I0128 18:57:49.126893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" event={"ID":"32eefe58-baa1-451c-9902-545a2b6e472b","Type":"ContainerDied","Data":"f0d35a0a68776b9816bde9a279d5e05dd2f57a512425c663cb8942eab26967fb"} Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.507586 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n5mr8"] Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.511076 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.524440 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5mr8"] Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.543351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-utilities\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.543687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnz6j\" (UniqueName: \"kubernetes.io/projected/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-kube-api-access-mnz6j\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.543760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-catalog-content\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.645820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnz6j\" (UniqueName: \"kubernetes.io/projected/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-kube-api-access-mnz6j\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.645874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-catalog-content\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.646057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-utilities\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.646604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-utilities\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.646999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-catalog-content\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.674462 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mnz6j\" (UniqueName: \"kubernetes.io/projected/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-kube-api-access-mnz6j\") pod \"redhat-operators-n5mr8\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:55 crc kubenswrapper[4749]: I0128 18:57:55.843531 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:57:56 crc kubenswrapper[4749]: I0128 18:57:56.203867 4749 generic.go:334] "Generic (PLEG): container finished" podID="21d60617-f5d4-461a-821c-6e59612876ae" containerID="83f489209ce842339d56ae7a33c76fa63346bd2ac10f3d69ea7678f351754154" exitCode=0 Jan 28 18:57:56 crc kubenswrapper[4749]: I0128 18:57:56.204073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zqw4k" event={"ID":"21d60617-f5d4-461a-821c-6e59612876ae","Type":"ContainerDied","Data":"83f489209ce842339d56ae7a33c76fa63346bd2ac10f3d69ea7678f351754154"} Jan 28 18:57:56 crc kubenswrapper[4749]: E0128 18:57:56.764187 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Jan 28 18:57:56 crc kubenswrapper[4749]: E0128 18:57:56.764394 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmw2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-z8gws_openstack(e22b0cba-1eee-4244-abe1-20e7694a9813): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:57:56 crc kubenswrapper[4749]: E0128 18:57:56.765577 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-z8gws" podUID="e22b0cba-1eee-4244-abe1-20e7694a9813" Jan 28 18:57:56 crc kubenswrapper[4749]: I0128 18:57:56.955109 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: i/o timeout" Jan 28 18:57:57 crc kubenswrapper[4749]: E0128 18:57:57.216258 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-z8gws" podUID="e22b0cba-1eee-4244-abe1-20e7694a9813" Jan 28 18:57:57 crc kubenswrapper[4749]: I0128 18:57:57.467637 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 18:57:57 crc kubenswrapper[4749]: I0128 18:57:57.467720 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 18:57:57 crc kubenswrapper[4749]: I0128 18:57:57.467787 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 18:57:57 crc kubenswrapper[4749]: I0128 18:57:57.468581 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f6753aa33414e3e6ec1468bb7657379fc59db0884f9f3f2e5c921382fe9d6fb"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 18:57:57 crc kubenswrapper[4749]: I0128 18:57:57.468643 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://6f6753aa33414e3e6ec1468bb7657379fc59db0884f9f3f2e5c921382fe9d6fb" gracePeriod=600 Jan 28 18:57:58 crc kubenswrapper[4749]: I0128 18:57:58.246704 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="6f6753aa33414e3e6ec1468bb7657379fc59db0884f9f3f2e5c921382fe9d6fb" exitCode=0 Jan 28 18:57:58 crc kubenswrapper[4749]: I0128 18:57:58.246793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" 
event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"6f6753aa33414e3e6ec1468bb7657379fc59db0884f9f3f2e5c921382fe9d6fb"} Jan 28 18:58:01 crc kubenswrapper[4749]: I0128 18:58:01.955785 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: i/o timeout" Jan 28 18:58:01 crc kubenswrapper[4749]: I0128 18:58:01.956610 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:58:06 crc kubenswrapper[4749]: I0128 18:58:06.956696 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: i/o timeout" Jan 28 18:58:11 crc kubenswrapper[4749]: I0128 18:58:11.957632 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: i/o timeout" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.278951 4749 scope.go:117] "RemoveContainer" containerID="55842b8585f907e511d5b26f14f0bb879adf5873f05ab0915623b3ec876b463c" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.388710 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.402040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" event={"ID":"32eefe58-baa1-451c-9902-545a2b6e472b","Type":"ContainerDied","Data":"d8a98bcb47fc3e24d159e8663a3b1994c2a315bdadbea4667171a0570edea63f"} Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.402164 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.439539 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-nb\") pod \"32eefe58-baa1-451c-9902-545a2b6e472b\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.439628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-swift-storage-0\") pod \"32eefe58-baa1-451c-9902-545a2b6e472b\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.439694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-config\") pod \"32eefe58-baa1-451c-9902-545a2b6e472b\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.439809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-svc\") pod \"32eefe58-baa1-451c-9902-545a2b6e472b\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.441075 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-sb\") pod \"32eefe58-baa1-451c-9902-545a2b6e472b\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.441439 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdwqj\" (UniqueName: \"kubernetes.io/projected/32eefe58-baa1-451c-9902-545a2b6e472b-kube-api-access-sdwqj\") pod \"32eefe58-baa1-451c-9902-545a2b6e472b\" (UID: \"32eefe58-baa1-451c-9902-545a2b6e472b\") " Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.460744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32eefe58-baa1-451c-9902-545a2b6e472b-kube-api-access-sdwqj" (OuterVolumeSpecName: "kube-api-access-sdwqj") pod "32eefe58-baa1-451c-9902-545a2b6e472b" (UID: "32eefe58-baa1-451c-9902-545a2b6e472b"). InnerVolumeSpecName "kube-api-access-sdwqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.501578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32eefe58-baa1-451c-9902-545a2b6e472b" (UID: "32eefe58-baa1-451c-9902-545a2b6e472b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.526529 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32eefe58-baa1-451c-9902-545a2b6e472b" (UID: "32eefe58-baa1-451c-9902-545a2b6e472b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.532787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-config" (OuterVolumeSpecName: "config") pod "32eefe58-baa1-451c-9902-545a2b6e472b" (UID: "32eefe58-baa1-451c-9902-545a2b6e472b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.546873 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.546915 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.546928 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.546945 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdwqj\" (UniqueName: \"kubernetes.io/projected/32eefe58-baa1-451c-9902-545a2b6e472b-kube-api-access-sdwqj\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.547716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32eefe58-baa1-451c-9902-545a2b6e472b" (UID: "32eefe58-baa1-451c-9902-545a2b6e472b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.548988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32eefe58-baa1-451c-9902-545a2b6e472b" (UID: "32eefe58-baa1-451c-9902-545a2b6e472b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.648895 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.648930 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32eefe58-baa1-451c-9902-545a2b6e472b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.747764 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7qlgx"] Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.761265 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7qlgx"] Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.869170 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:58:12 crc kubenswrapper[4749]: I0128 18:58:12.883088 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" path="/var/lib/kubelet/pods/32eefe58-baa1-451c-9902-545a2b6e472b/volumes" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.378728 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:58:13 crc kubenswrapper[4749]: E0128 18:58:13.418390 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 28 18:58:13 crc kubenswrapper[4749]: E0128 18:58:13.418583 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvt8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-j6wkc_openstack(9425c299-238f-4293-920f-6ae7ab0c6bb1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:58:13 crc kubenswrapper[4749]: E0128 18:58:13.419745 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-j6wkc" podUID="9425c299-238f-4293-920f-6ae7ab0c6bb1" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.431064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zqw4k" event={"ID":"21d60617-f5d4-461a-821c-6e59612876ae","Type":"ContainerDied","Data":"0f43649f7f1753fa4262bfbd4c580702fecb6285e68a02efb4420ff04675abaa"} Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.431101 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f43649f7f1753fa4262bfbd4c580702fecb6285e68a02efb4420ff04675abaa" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.431151 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zqw4k" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.472134 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-credential-keys\") pod \"21d60617-f5d4-461a-821c-6e59612876ae\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.472238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-scripts\") pod \"21d60617-f5d4-461a-821c-6e59612876ae\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.472279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-fernet-keys\") pod \"21d60617-f5d4-461a-821c-6e59612876ae\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.472317 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-config-data\") pod \"21d60617-f5d4-461a-821c-6e59612876ae\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.472418 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lzc2\" (UniqueName: \"kubernetes.io/projected/21d60617-f5d4-461a-821c-6e59612876ae-kube-api-access-7lzc2\") pod \"21d60617-f5d4-461a-821c-6e59612876ae\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.472530 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-combined-ca-bundle\") pod \"21d60617-f5d4-461a-821c-6e59612876ae\" (UID: \"21d60617-f5d4-461a-821c-6e59612876ae\") " Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.483652 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-scripts" (OuterVolumeSpecName: "scripts") pod "21d60617-f5d4-461a-821c-6e59612876ae" (UID: "21d60617-f5d4-461a-821c-6e59612876ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.483670 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "21d60617-f5d4-461a-821c-6e59612876ae" (UID: "21d60617-f5d4-461a-821c-6e59612876ae"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.484089 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d60617-f5d4-461a-821c-6e59612876ae-kube-api-access-7lzc2" (OuterVolumeSpecName: "kube-api-access-7lzc2") pod "21d60617-f5d4-461a-821c-6e59612876ae" (UID: "21d60617-f5d4-461a-821c-6e59612876ae"). InnerVolumeSpecName "kube-api-access-7lzc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.495883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "21d60617-f5d4-461a-821c-6e59612876ae" (UID: "21d60617-f5d4-461a-821c-6e59612876ae"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.527614 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-config-data" (OuterVolumeSpecName: "config-data") pod "21d60617-f5d4-461a-821c-6e59612876ae" (UID: "21d60617-f5d4-461a-821c-6e59612876ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.529293 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21d60617-f5d4-461a-821c-6e59612876ae" (UID: "21d60617-f5d4-461a-821c-6e59612876ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.581633 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.581672 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.581689 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.581701 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.581713 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lzc2\" (UniqueName: \"kubernetes.io/projected/21d60617-f5d4-461a-821c-6e59612876ae-kube-api-access-7lzc2\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:13 crc kubenswrapper[4749]: I0128 18:58:13.581727 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d60617-f5d4-461a-821c-6e59612876ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:14 crc kubenswrapper[4749]: E0128 18:58:14.450923 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-j6wkc" podUID="9425c299-238f-4293-920f-6ae7ab0c6bb1" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.484778 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zqw4k"] Jan 28 18:58:14 crc 
kubenswrapper[4749]: I0128 18:58:14.496432 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zqw4k"] Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.581266 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bljrg"] Jan 28 18:58:14 crc kubenswrapper[4749]: E0128 18:58:14.581854 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="init" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.581879 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="init" Jan 28 18:58:14 crc kubenswrapper[4749]: E0128 18:58:14.581893 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.581900 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" Jan 28 18:58:14 crc kubenswrapper[4749]: E0128 18:58:14.581910 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d60617-f5d4-461a-821c-6e59612876ae" containerName="keystone-bootstrap" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.581916 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d60617-f5d4-461a-821c-6e59612876ae" containerName="keystone-bootstrap" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.582137 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d60617-f5d4-461a-821c-6e59612876ae" containerName="keystone-bootstrap" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.582156 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.583010 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.585146 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wl54b" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.585670 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.585865 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.587381 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.588260 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.589939 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bljrg"] Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.742509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-scripts\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.742559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ndlh\" (UniqueName: \"kubernetes.io/projected/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-kube-api-access-7ndlh\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.742607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-credential-keys\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.742681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-config-data\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.742724 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-combined-ca-bundle\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.742929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-fernet-keys\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: E0128 18:58:14.801580 4749 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 28 18:58:14 crc kubenswrapper[4749]: E0128 18:58:14.801759 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dcfp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dhqq7_openstack(e9b37dae-71dc-4af8-9229-4f7124bcbb16): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:58:14 crc kubenswrapper[4749]: E0128 18:58:14.803091 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dhqq7" podUID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.844738 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-fernet-keys\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" 
Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.844848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-scripts\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.844870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ndlh\" (UniqueName: \"kubernetes.io/projected/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-kube-api-access-7ndlh\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.845493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-credential-keys\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.845542 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-config-data\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.845595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-combined-ca-bundle\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.850262 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-fernet-keys\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.850551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-credential-keys\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.850568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-scripts\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.851094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-combined-ca-bundle\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.851674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-config-data\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.860605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ndlh\" (UniqueName: \"kubernetes.io/projected/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-kube-api-access-7ndlh\") pod \"keystone-bootstrap-bljrg\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.889176 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d60617-f5d4-461a-821c-6e59612876ae" path="/var/lib/kubelet/pods/21d60617-f5d4-461a-821c-6e59612876ae/volumes" Jan 28 18:58:14 crc kubenswrapper[4749]: I0128 18:58:14.916157 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:15 crc kubenswrapper[4749]: W0128 18:58:15.155510 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c312272_e82e_4cdc_8501_5b27b63a3ba0.slice/crio-1fca54a7a8cce83a7a0a869c188b31c91edf8a4fbdec474357d56abd7a3093d4 WatchSource:0}: Error finding container 1fca54a7a8cce83a7a0a869c188b31c91edf8a4fbdec474357d56abd7a3093d4: Status 404 returned error can't find the container with id 1fca54a7a8cce83a7a0a869c188b31c91edf8a4fbdec474357d56abd7a3093d4 Jan 28 18:58:15 crc kubenswrapper[4749]: I0128 18:58:15.169668 4749 scope.go:117] "RemoveContainer" containerID="fe5fa8a4d930fa586c52dad348820c19bc0d119c9d8842b1c8b01cba7640cedd" Jan 28 18:58:15 crc kubenswrapper[4749]: E0128 18:58:15.287572 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Jan 28 18:58:15 crc kubenswrapper[4749]: E0128 18:58:15.287764 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8h2dg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-6mtp7_openstack(fb710c26-5706-4ed4-a08d-641315121c9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:58:15 crc kubenswrapper[4749]: E0128 18:58:15.289091 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-6mtp7" podUID="fb710c26-5706-4ed4-a08d-641315121c9e" Jan 28 18:58:15 crc kubenswrapper[4749]: I0128 18:58:15.392412 4749 scope.go:117] "RemoveContainer" containerID="527187721aaa442981658c2e7f17bdfb4ec3996975013b0bf4619d4f94f7c4ec" Jan 28 18:58:15 crc kubenswrapper[4749]: I0128 18:58:15.429928 4749 scope.go:117] "RemoveContainer" containerID="ffd739e178035a0a80263ddfa883436d23618669a6ffd6b8554f99da5a12189b" Jan 28 18:58:15 crc kubenswrapper[4749]: I0128 18:58:15.466851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c312272-e82e-4cdc-8501-5b27b63a3ba0","Type":"ContainerStarted","Data":"1fca54a7a8cce83a7a0a869c188b31c91edf8a4fbdec474357d56abd7a3093d4"} Jan 28 18:58:15 crc kubenswrapper[4749]: I0128 18:58:15.658929 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:58:15 crc kubenswrapper[4749]: I0128 18:58:15.740065 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5mr8"] Jan 28 18:58:15 crc kubenswrapper[4749]: I0128 18:58:15.811871 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bljrg"] Jan 28 18:58:16 crc kubenswrapper[4749]: 
I0128 18:58:16.480666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9177835f-67ce-45ec-9c8c-843e54b25deb","Type":"ContainerStarted","Data":"8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b"} Jan 28 18:58:16 crc kubenswrapper[4749]: I0128 18:58:16.485440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e"} Jan 28 18:58:16 crc kubenswrapper[4749]: E0128 18:58:16.927670 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-6mtp7" podUID="fb710c26-5706-4ed4-a08d-641315121c9e" Jan 28 18:58:16 crc kubenswrapper[4749]: E0128 18:58:16.927788 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-dhqq7" podUID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" Jan 28 18:58:16 crc kubenswrapper[4749]: W0128 18:58:16.950055 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c4707cc_a6c3_4f08_870d_9d8c3a9da583.slice/crio-0c42db171e0f95a210f1c598b45708754b6ed5d18e2da9b9b34d7e420dab63e9 WatchSource:0}: Error finding container 0c42db171e0f95a210f1c598b45708754b6ed5d18e2da9b9b34d7e420dab63e9: Status 404 returned error can't find the container with id 0c42db171e0f95a210f1c598b45708754b6ed5d18e2da9b9b34d7e420dab63e9 Jan 28 18:58:16 crc kubenswrapper[4749]: I0128 18:58:16.958386 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-7qlgx" podUID="32eefe58-baa1-451c-9902-545a2b6e472b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: i/o timeout" Jan 28 18:58:16 crc kubenswrapper[4749]: W0128 18:58:16.958587 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ac6957_d7ec_4aee_9a3e_3b0b3ebbbb0d.slice/crio-e03aec62389f583782afba9ed42dbfa19e024caacab4d34fd363d7b050c01eef WatchSource:0}: Error finding container e03aec62389f583782afba9ed42dbfa19e024caacab4d34fd363d7b050c01eef: Status 404 returned error can't find the container with id e03aec62389f583782afba9ed42dbfa19e024caacab4d34fd363d7b050c01eef Jan 28 18:58:16 crc kubenswrapper[4749]: W0128 18:58:16.970491 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a76dd69_b64f_47d2_bf48_38731cd1b2d7.slice/crio-bc2133f3c4178e3b478caef8338caab544bf6ceed1b966ae9f5430725ad2968a WatchSource:0}: Error finding container bc2133f3c4178e3b478caef8338caab544bf6ceed1b966ae9f5430725ad2968a: Status 404 returned error can't find the container with id bc2133f3c4178e3b478caef8338caab544bf6ceed1b966ae9f5430725ad2968a Jan 28 18:58:17 crc kubenswrapper[4749]: I0128 18:58:17.503847 4749 scope.go:117] "RemoveContainer" containerID="f0d35a0a68776b9816bde9a279d5e05dd2f57a512425c663cb8942eab26967fb" Jan 28 18:58:17 crc kubenswrapper[4749]: I0128 18:58:17.519863 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerStarted","Data":"e03aec62389f583782afba9ed42dbfa19e024caacab4d34fd363d7b050c01eef"} Jan 28 18:58:17 crc kubenswrapper[4749]: I0128 18:58:17.525285 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c4707cc-a6c3-4f08-870d-9d8c3a9da583","Type":"ContainerStarted","Data":"0c42db171e0f95a210f1c598b45708754b6ed5d18e2da9b9b34d7e420dab63e9"} Jan 28 18:58:17 crc kubenswrapper[4749]: I0128 18:58:17.526504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c312272-e82e-4cdc-8501-5b27b63a3ba0","Type":"ContainerStarted","Data":"880a7dc39a6a4a6ccdae1fd997c0bd3c477728c52d4b8bf60922ccbc63c7cd91"} Jan 28 18:58:17 crc kubenswrapper[4749]: I0128 18:58:17.551519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bljrg" event={"ID":"2a76dd69-b64f-47d2-bf48-38731cd1b2d7","Type":"ContainerStarted","Data":"bc2133f3c4178e3b478caef8338caab544bf6ceed1b966ae9f5430725ad2968a"} Jan 28 18:58:17 crc kubenswrapper[4749]: I0128 18:58:17.595657 4749 scope.go:117] "RemoveContainer" containerID="b9e2517c4b2feebde90e443652a97f7363ab7cae9c4a3d2863497e77939ce474" Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.587187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c4707cc-a6c3-4f08-870d-9d8c3a9da583","Type":"ContainerStarted","Data":"3c117463354fb01b8ba3165872468641387f0f3541f0ffe39c4e83aaa554b672"} Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.597969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c312272-e82e-4cdc-8501-5b27b63a3ba0","Type":"ContainerStarted","Data":"8224af6ead75455ff44004cb6400c942a48571511762516f155cf6f0ca763836"} Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.607309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bljrg" event={"ID":"2a76dd69-b64f-47d2-bf48-38731cd1b2d7","Type":"ContainerStarted","Data":"4744fe55f05cf7776fdc2efc89c074b120d9340619840321981040e341d474dc"} Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.611633 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerID="5f245ece56dd3fbf29d85f17cb9590383b21f6462ae3111510e207040edb4e9c" exitCode=0 Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.611763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerDied","Data":"5f245ece56dd3fbf29d85f17cb9590383b21f6462ae3111510e207040edb4e9c"} Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.616780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z8gws" event={"ID":"e22b0cba-1eee-4244-abe1-20e7694a9813","Type":"ContainerStarted","Data":"ef710b593bbf63d8356c3aefdd28e05efcaed121a63ba37ed81d06bda125384f"} Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.630912 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=35.630891917 podStartE2EDuration="35.630891917s" podCreationTimestamp="2026-01-28 18:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:58:18.620367376 +0000 UTC m=+1366.631894171" watchObservedRunningTime="2026-01-28 18:58:18.630891917 +0000 UTC m=+1366.642418692" Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.640140 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-z8gws" podStartSLOduration=4.273484252 podStartE2EDuration="44.640126365s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="2026-01-28 18:57:36.749531505 +0000 UTC m=+1324.761058280" lastFinishedPulling="2026-01-28 18:58:17.116173628 +0000 UTC m=+1365.127700393" observedRunningTime="2026-01-28 18:58:18.639749806 +0000 UTC m=+1366.651276591" watchObservedRunningTime="2026-01-28 18:58:18.640126365 +0000 UTC m=+1366.651653140" Jan 28 18:58:18 crc kubenswrapper[4749]: I0128 18:58:18.663794 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bljrg" podStartSLOduration=4.6637716000000005 podStartE2EDuration="4.6637716s" podCreationTimestamp="2026-01-28 18:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:58:18.65645653 +0000 UTC m=+1366.667983305" watchObservedRunningTime="2026-01-28 18:58:18.6637716 +0000 UTC m=+1366.675298375" Jan 28 18:58:19 crc kubenswrapper[4749]: I0128 18:58:19.630284 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c4707cc-a6c3-4f08-870d-9d8c3a9da583","Type":"ContainerStarted","Data":"fe96bafc836b8fe4bd89f61179e16c46df82a02f668b823efbf0da3bd28aebb7"} Jan 28 18:58:20 crc kubenswrapper[4749]: I0128 18:58:20.671040 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=37.671017529 podStartE2EDuration="37.671017529s" podCreationTimestamp="2026-01-28 18:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:58:20.666316883 +0000 UTC m=+1368.677843678" watchObservedRunningTime="2026-01-28 18:58:20.671017529 +0000 UTC m=+1368.682544314" Jan 28 18:58:22 crc kubenswrapper[4749]: I0128 18:58:22.676586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerStarted","Data":"0bbefaa424dab42ca864a3aab11db13f65a5aaf094cc18d7f00ed0441f4def8c"} Jan 28 18:58:22 crc kubenswrapper[4749]: I0128 18:58:22.678580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9177835f-67ce-45ec-9c8c-843e54b25deb","Type":"ContainerStarted","Data":"2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb"} Jan 28 18:58:23 crc kubenswrapper[4749]: I0128 18:58:23.775870 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 18:58:23 crc kubenswrapper[4749]: I0128 18:58:23.777678 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 18:58:23 crc kubenswrapper[4749]: I0128 18:58:23.910495 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 18:58:23 crc kubenswrapper[4749]: I0128 18:58:23.922718 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.539468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.539812 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.576472 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.591669 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.723798 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerID="0bbefaa424dab42ca864a3aab11db13f65a5aaf094cc18d7f00ed0441f4def8c" exitCode=0 Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.723870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerDied","Data":"0bbefaa424dab42ca864a3aab11db13f65a5aaf094cc18d7f00ed0441f4def8c"} Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.724787 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.724818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.726356 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 18:58:24 crc kubenswrapper[4749]: I0128 18:58:24.726381 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 18:58:40 crc kubenswrapper[4749]: I0128 18:58:40.899416 4749 generic.go:334] "Generic (PLEG): container finished" podID="2a76dd69-b64f-47d2-bf48-38731cd1b2d7" containerID="4744fe55f05cf7776fdc2efc89c074b120d9340619840321981040e341d474dc" exitCode=0 Jan 28 18:58:40 crc kubenswrapper[4749]: I0128 18:58:40.899524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bljrg" event={"ID":"2a76dd69-b64f-47d2-bf48-38731cd1b2d7","Type":"ContainerDied","Data":"4744fe55f05cf7776fdc2efc89c074b120d9340619840321981040e341d474dc"} Jan 28 18:58:48 crc kubenswrapper[4749]: E0128 18:58:48.123568 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 28 18:58:48 crc kubenswrapper[4749]: E0128 18:58:48.124621 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dcfp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dhqq7_openstack(e9b37dae-71dc-4af8-9229-4f7124bcbb16): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:58:48 crc kubenswrapper[4749]: E0128 18:58:48.126864 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dhqq7" podUID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.230104 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.313416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-combined-ca-bundle\") pod \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.313510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ndlh\" (UniqueName: \"kubernetes.io/projected/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-kube-api-access-7ndlh\") pod \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.313558 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-scripts\") pod \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.313606 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-fernet-keys\") pod \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.313702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-credential-keys\") pod \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.313817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-config-data\") pod \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\" (UID: \"2a76dd69-b64f-47d2-bf48-38731cd1b2d7\") " Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.321230 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2a76dd69-b64f-47d2-bf48-38731cd1b2d7" (UID: "2a76dd69-b64f-47d2-bf48-38731cd1b2d7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.321360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2a76dd69-b64f-47d2-bf48-38731cd1b2d7" (UID: "2a76dd69-b64f-47d2-bf48-38731cd1b2d7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.321615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-scripts" (OuterVolumeSpecName: "scripts") pod "2a76dd69-b64f-47d2-bf48-38731cd1b2d7" (UID: "2a76dd69-b64f-47d2-bf48-38731cd1b2d7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.326699 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-kube-api-access-7ndlh" (OuterVolumeSpecName: "kube-api-access-7ndlh") pod "2a76dd69-b64f-47d2-bf48-38731cd1b2d7" (UID: "2a76dd69-b64f-47d2-bf48-38731cd1b2d7"). InnerVolumeSpecName "kube-api-access-7ndlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.344483 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a76dd69-b64f-47d2-bf48-38731cd1b2d7" (UID: "2a76dd69-b64f-47d2-bf48-38731cd1b2d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.360978 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-config-data" (OuterVolumeSpecName: "config-data") pod "2a76dd69-b64f-47d2-bf48-38731cd1b2d7" (UID: "2a76dd69-b64f-47d2-bf48-38731cd1b2d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.416706 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.416743 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.416754 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.416763 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.416778 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ndlh\" (UniqueName: \"kubernetes.io/projected/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-kube-api-access-7ndlh\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.416787 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a76dd69-b64f-47d2-bf48-38731cd1b2d7-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:48 crc kubenswrapper[4749]: E0128 18:58:48.684747 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Jan 28 18:58:48 crc kubenswrapper[4749]: E0128 18:58:48.684879 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njttk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(9177835f-67ce-45ec-9c8c-843e54b25deb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.975734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bljrg" event={"ID":"2a76dd69-b64f-47d2-bf48-38731cd1b2d7","Type":"ContainerDied","Data":"bc2133f3c4178e3b478caef8338caab544bf6ceed1b966ae9f5430725ad2968a"} Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.976109 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc2133f3c4178e3b478caef8338caab544bf6ceed1b966ae9f5430725ad2968a" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.975944 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bljrg" Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.978095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-j6wkc" event={"ID":"9425c299-238f-4293-920f-6ae7ab0c6bb1","Type":"ContainerStarted","Data":"96ddcecc6d4371edc7a5fe4a62190b4db4ccfab13b78b9050b57062b5b668a46"} Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.981222 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerStarted","Data":"4c791ddf7b0951e536637ecce5ac329ade0090d4a43dab367ede7e34c5c4e425"} Jan 28 18:58:48 crc kubenswrapper[4749]: I0128 18:58:48.985219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6mtp7" event={"ID":"fb710c26-5706-4ed4-a08d-641315121c9e","Type":"ContainerStarted","Data":"33ecd0b56caf949988ede8b5f2774a38aaf9aba7ae08cc05cf7510833a868ba8"} Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.036079 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-6mtp7" podStartSLOduration=2.6509466550000003 podStartE2EDuration="1m15.036031591s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="2026-01-28 18:57:36.291661903 +0000 UTC m=+1324.303188678" lastFinishedPulling="2026-01-28 18:58:48.676746849 +0000 UTC m=+1396.688273614" observedRunningTime="2026-01-28 18:58:49.029980911 +0000 UTC m=+1397.041507696" watchObservedRunningTime="2026-01-28 18:58:49.036031591 +0000 UTC m=+1397.047558366" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.043844 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-j6wkc" podStartSLOduration=3.092709617 podStartE2EDuration="1m15.043824084s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="2026-01-28 18:57:36.721135511 +0000 UTC m=+1324.732662286" lastFinishedPulling="2026-01-28 18:58:48.672249978 +0000 UTC m=+1396.683776753" observedRunningTime="2026-01-28 18:58:49.003737692 +0000 UTC m=+1397.015264467" watchObservedRunningTime="2026-01-28 18:58:49.043824084 +0000 UTC m=+1397.055350849" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.085161 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n5mr8" podStartSLOduration=25.855462554 podStartE2EDuration="54.085132976s" podCreationTimestamp="2026-01-28 18:57:55 +0000 UTC" firstStartedPulling="2026-01-28 18:58:19.874538677 +0000 UTC m=+1367.886065452" lastFinishedPulling="2026-01-28 18:58:48.104209099 +0000 UTC m=+1396.115735874" observedRunningTime="2026-01-28 18:58:49.069606972 +0000 UTC m=+1397.081133757" watchObservedRunningTime="2026-01-28 18:58:49.085132976 +0000 UTC m=+1397.096659761" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.370614 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-54bd464d95-gdqhz"] Jan 28 18:58:49 crc kubenswrapper[4749]: E0128 18:58:49.371380 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a76dd69-b64f-47d2-bf48-38731cd1b2d7" containerName="keystone-bootstrap" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.371395 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a76dd69-b64f-47d2-bf48-38731cd1b2d7" containerName="keystone-bootstrap" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.371593 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2a76dd69-b64f-47d2-bf48-38731cd1b2d7" containerName="keystone-bootstrap" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.372366 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.374937 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.376577 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.376638 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.376733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.381021 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.381228 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wl54b" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.383120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-54bd464d95-gdqhz"] Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.437863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-public-tls-certs\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.437954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87rpd\" (UniqueName: \"kubernetes.io/projected/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-kube-api-access-87rpd\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.438003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-credential-keys\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.438025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-internal-tls-certs\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.438061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-fernet-keys\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.438084 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-scripts\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.438101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-combined-ca-bundle\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.438123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-config-data\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.539574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87rpd\" (UniqueName: \"kubernetes.io/projected/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-kube-api-access-87rpd\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.539651 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-credential-keys\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.539672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-internal-tls-certs\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.539705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-fernet-keys\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.539727 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-scripts\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.539746 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-combined-ca-bundle\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.539767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-config-data\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.539845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-public-tls-certs\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.545537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-internal-tls-certs\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.554848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-fernet-keys\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.554950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-combined-ca-bundle\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.555557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-scripts\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.556070 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-public-tls-certs\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.559751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-credential-keys\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.564074 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87rpd\" (UniqueName: \"kubernetes.io/projected/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-kube-api-access-87rpd\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.593459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae-config-data\") pod \"keystone-54bd464d95-gdqhz\" (UID: \"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae\") " 
pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:49 crc kubenswrapper[4749]: I0128 18:58:49.693978 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:50 crc kubenswrapper[4749]: I0128 18:58:50.298417 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-54bd464d95-gdqhz"] Jan 28 18:58:50 crc kubenswrapper[4749]: W0128 18:58:50.302237 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad4c5d7_7f28_4a51_9f19_c5cb429f8fae.slice/crio-10d9b93d22ab588c26a5d4a6992a19bd27580239668f63b0ff9b33ceeaabb411 WatchSource:0}: Error finding container 10d9b93d22ab588c26a5d4a6992a19bd27580239668f63b0ff9b33ceeaabb411: Status 404 returned error can't find the container with id 10d9b93d22ab588c26a5d4a6992a19bd27580239668f63b0ff9b33ceeaabb411 Jan 28 18:58:50 crc kubenswrapper[4749]: I0128 18:58:50.982676 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 18:58:50 crc kubenswrapper[4749]: I0128 18:58:50.984835 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 18:58:51 crc kubenswrapper[4749]: I0128 18:58:51.060123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54bd464d95-gdqhz" event={"ID":"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae","Type":"ContainerStarted","Data":"b4bfdf73b6efb5992b3a970c036d61a745a5a244c500ec38381df0c811a757cb"} Jan 28 18:58:51 crc kubenswrapper[4749]: I0128 18:58:51.060456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54bd464d95-gdqhz" event={"ID":"0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae","Type":"ContainerStarted","Data":"10d9b93d22ab588c26a5d4a6992a19bd27580239668f63b0ff9b33ceeaabb411"} Jan 28 18:58:51 crc kubenswrapper[4749]: I0128 18:58:51.061050 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:58:51 crc kubenswrapper[4749]: I0128 18:58:51.103977 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-54bd464d95-gdqhz" podStartSLOduration=2.103953991 podStartE2EDuration="2.103953991s" podCreationTimestamp="2026-01-28 18:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:58:51.086449168 +0000 UTC m=+1399.097975943" watchObservedRunningTime="2026-01-28 18:58:51.103953991 +0000 UTC m=+1399.115480766" Jan 28 18:58:51 crc kubenswrapper[4749]: I0128 18:58:51.240051 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 18:58:51 crc kubenswrapper[4749]: I0128 18:58:51.240162 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 18:58:51 crc kubenswrapper[4749]: I0128 18:58:51.290054 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 18:58:51 crc kubenswrapper[4749]: I0128 18:58:51.520257 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 18:58:52 crc kubenswrapper[4749]: I0128 18:58:52.076676 4749 generic.go:334] "Generic (PLEG): container finished" podID="e22b0cba-1eee-4244-abe1-20e7694a9813" containerID="ef710b593bbf63d8356c3aefdd28e05efcaed121a63ba37ed81d06bda125384f" 
exitCode=0 Jan 28 18:58:52 crc kubenswrapper[4749]: I0128 18:58:52.076753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z8gws" event={"ID":"e22b0cba-1eee-4244-abe1-20e7694a9813","Type":"ContainerDied","Data":"ef710b593bbf63d8356c3aefdd28e05efcaed121a63ba37ed81d06bda125384f"} Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.528149 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z8gws" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.657582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-scripts\") pod \"e22b0cba-1eee-4244-abe1-20e7694a9813\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.657818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmw2r\" (UniqueName: \"kubernetes.io/projected/e22b0cba-1eee-4244-abe1-20e7694a9813-kube-api-access-qmw2r\") pod \"e22b0cba-1eee-4244-abe1-20e7694a9813\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.658154 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-config-data\") pod \"e22b0cba-1eee-4244-abe1-20e7694a9813\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.659117 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22b0cba-1eee-4244-abe1-20e7694a9813-logs\") pod \"e22b0cba-1eee-4244-abe1-20e7694a9813\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.659159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-combined-ca-bundle\") pod \"e22b0cba-1eee-4244-abe1-20e7694a9813\" (UID: \"e22b0cba-1eee-4244-abe1-20e7694a9813\") " Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.662550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22b0cba-1eee-4244-abe1-20e7694a9813-logs" (OuterVolumeSpecName: "logs") pod "e22b0cba-1eee-4244-abe1-20e7694a9813" (UID: "e22b0cba-1eee-4244-abe1-20e7694a9813"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.665147 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-scripts" (OuterVolumeSpecName: "scripts") pod "e22b0cba-1eee-4244-abe1-20e7694a9813" (UID: "e22b0cba-1eee-4244-abe1-20e7694a9813"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.665724 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.665782 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22b0cba-1eee-4244-abe1-20e7694a9813-logs\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.667453 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22b0cba-1eee-4244-abe1-20e7694a9813-kube-api-access-qmw2r" (OuterVolumeSpecName: "kube-api-access-qmw2r") pod "e22b0cba-1eee-4244-abe1-20e7694a9813" (UID: "e22b0cba-1eee-4244-abe1-20e7694a9813"). InnerVolumeSpecName "kube-api-access-qmw2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.701566 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-config-data" (OuterVolumeSpecName: "config-data") pod "e22b0cba-1eee-4244-abe1-20e7694a9813" (UID: "e22b0cba-1eee-4244-abe1-20e7694a9813"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.734948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e22b0cba-1eee-4244-abe1-20e7694a9813" (UID: "e22b0cba-1eee-4244-abe1-20e7694a9813"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.778255 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmw2r\" (UniqueName: \"kubernetes.io/projected/e22b0cba-1eee-4244-abe1-20e7694a9813-kube-api-access-qmw2r\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.778293 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:53 crc kubenswrapper[4749]: I0128 18:58:53.778303 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22b0cba-1eee-4244-abe1-20e7694a9813-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.099999 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z8gws" event={"ID":"e22b0cba-1eee-4244-abe1-20e7694a9813","Type":"ContainerDied","Data":"3b95c0a5a31572d396a2ce36a0add5c4b41fea8b7faf0cb5096bfc5406a5979a"} Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.100035 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-z8gws" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.100038 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b95c0a5a31572d396a2ce36a0add5c4b41fea8b7faf0cb5096bfc5406a5979a" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.211070 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7699ff9496-twrqx"] Jan 28 18:58:54 crc kubenswrapper[4749]: E0128 18:58:54.211720 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22b0cba-1eee-4244-abe1-20e7694a9813" containerName="placement-db-sync" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.211741 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22b0cba-1eee-4244-abe1-20e7694a9813" containerName="placement-db-sync" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.211991 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22b0cba-1eee-4244-abe1-20e7694a9813" containerName="placement-db-sync" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.213510 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.217733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.217775 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.218019 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pjq9l" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.217733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.218051 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.239063 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7699ff9496-twrqx"] Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.292737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-combined-ca-bundle\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.292805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-logs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.292993 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-scripts\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.293092 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-public-tls-certs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.293253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrlhw\" (UniqueName: \"kubernetes.io/projected/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-kube-api-access-mrlhw\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.293390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-internal-tls-certs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.293668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-config-data\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.396068 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-combined-ca-bundle\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.396149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-logs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.396209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-scripts\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.396244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-public-tls-certs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.396306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrlhw\" (UniqueName: \"kubernetes.io/projected/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-kube-api-access-mrlhw\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.396372 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-internal-tls-certs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.396470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-config-data\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.397890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-logs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.400865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-scripts\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.401779 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-combined-ca-bundle\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.402580 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-config-data\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.402678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-internal-tls-certs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.403167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-public-tls-certs\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.413155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrlhw\" (UniqueName: \"kubernetes.io/projected/190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1-kube-api-access-mrlhw\") pod \"placement-7699ff9496-twrqx\" (UID: \"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1\") " pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:54 crc kubenswrapper[4749]: I0128 18:58:54.546660 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:58:55 crc kubenswrapper[4749]: I0128 18:58:55.901132 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:58:55 crc kubenswrapper[4749]: I0128 18:58:55.904777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 18:58:57 crc kubenswrapper[4749]: I0128 18:58:57.001399 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 18:58:57 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:58:57 crc kubenswrapper[4749]: > Jan 28 18:58:57 crc kubenswrapper[4749]: I0128 18:58:57.136585 4749 generic.go:334] "Generic (PLEG): container finished" podID="9425c299-238f-4293-920f-6ae7ab0c6bb1" containerID="96ddcecc6d4371edc7a5fe4a62190b4db4ccfab13b78b9050b57062b5b668a46" exitCode=0 Jan 28 18:58:57 crc kubenswrapper[4749]: I0128 18:58:57.136627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-j6wkc" event={"ID":"9425c299-238f-4293-920f-6ae7ab0c6bb1","Type":"ContainerDied","Data":"96ddcecc6d4371edc7a5fe4a62190b4db4ccfab13b78b9050b57062b5b668a46"} Jan 28 18:58:58 crc kubenswrapper[4749]: I0128 18:58:58.788922 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:58:58 crc kubenswrapper[4749]: I0128 18:58:58.966981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-db-sync-config-data\") pod \"9425c299-238f-4293-920f-6ae7ab0c6bb1\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " Jan 28 18:58:58 crc kubenswrapper[4749]: I0128 18:58:58.967073 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvt8f\" (UniqueName: \"kubernetes.io/projected/9425c299-238f-4293-920f-6ae7ab0c6bb1-kube-api-access-jvt8f\") pod \"9425c299-238f-4293-920f-6ae7ab0c6bb1\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " Jan 28 18:58:58 crc kubenswrapper[4749]: I0128 18:58:58.967294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-combined-ca-bundle\") pod \"9425c299-238f-4293-920f-6ae7ab0c6bb1\" (UID: \"9425c299-238f-4293-920f-6ae7ab0c6bb1\") " Jan 28 18:58:58 crc kubenswrapper[4749]: I0128 18:58:58.973243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9425c299-238f-4293-920f-6ae7ab0c6bb1-kube-api-access-jvt8f" (OuterVolumeSpecName: "kube-api-access-jvt8f") pod "9425c299-238f-4293-920f-6ae7ab0c6bb1" (UID: "9425c299-238f-4293-920f-6ae7ab0c6bb1"). InnerVolumeSpecName "kube-api-access-jvt8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:58:58 crc kubenswrapper[4749]: I0128 18:58:58.985256 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9425c299-238f-4293-920f-6ae7ab0c6bb1" (UID: "9425c299-238f-4293-920f-6ae7ab0c6bb1"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:58 crc kubenswrapper[4749]: I0128 18:58:58.997390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9425c299-238f-4293-920f-6ae7ab0c6bb1" (UID: "9425c299-238f-4293-920f-6ae7ab0c6bb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.071107 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.071139 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9425c299-238f-4293-920f-6ae7ab0c6bb1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.071149 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvt8f\" (UniqueName: \"kubernetes.io/projected/9425c299-238f-4293-920f-6ae7ab0c6bb1-kube-api-access-jvt8f\") on node \"crc\" DevicePath \"\"" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.105913 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7699ff9496-twrqx"] Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.158837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-j6wkc" event={"ID":"9425c299-238f-4293-920f-6ae7ab0c6bb1","Type":"ContainerDied","Data":"20fc7eadc24c98be396ff5220e8546faa8e4234d3243c44814528ef295b613c4"} Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.158910 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20fc7eadc24c98be396ff5220e8546faa8e4234d3243c44814528ef295b613c4" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.158853 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-j6wkc" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.160120 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7699ff9496-twrqx" event={"ID":"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1","Type":"ContainerStarted","Data":"3e4561e2089157d71747e1677ac408c31883d60fe09ab2e27a4ad49992b92e45"} Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.442858 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7bb49d4f49-jkfrz"] Jan 28 18:58:59 crc kubenswrapper[4749]: E0128 18:58:59.443796 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9425c299-238f-4293-920f-6ae7ab0c6bb1" containerName="barbican-db-sync" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.443822 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9425c299-238f-4293-920f-6ae7ab0c6bb1" containerName="barbican-db-sync" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.444101 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9425c299-238f-4293-920f-6ae7ab0c6bb1" containerName="barbican-db-sync" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.445604 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.449947 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-kvhbz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.450231 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.450415 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.468380 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bb49d4f49-jkfrz"] Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.480848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-config-data-custom\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.488344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-logs\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.488414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-combined-ca-bundle\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.488544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmgg4\" (UniqueName: \"kubernetes.io/projected/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-kube-api-access-nmgg4\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.488784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-config-data\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.591307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-config-data\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.591388 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-config-data-custom\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: 
\"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.591480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-logs\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.591500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-combined-ca-bundle\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.591546 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmgg4\" (UniqueName: \"kubernetes.io/projected/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-kube-api-access-nmgg4\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.593835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-logs\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.604139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-combined-ca-bundle\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.604416 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-config-data\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.604453 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-l7sjv"] Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.605621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-config-data-custom\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.606290 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.619436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-l7sjv"] Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.643354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmgg4\" (UniqueName: \"kubernetes.io/projected/3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d-kube-api-access-nmgg4\") pod \"barbican-worker-7bb49d4f49-jkfrz\" (UID: \"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d\") " pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.659396 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-585f7f57cd-7qgls"] Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.662026 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.667264 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.683547 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-585f7f57cd-7qgls"] Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn6wm\" (UniqueName: \"kubernetes.io/projected/591ffff8-803a-414b-b863-e3978dee85ce-kube-api-access-fn6wm\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtmcs\" (UniqueName: \"kubernetes.io/projected/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-kube-api-access-rtmcs\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-config-data\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-combined-ca-bundle\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 
18:58:59.694521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-config-data-custom\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-config\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591ffff8-803a-414b-b863-e3978dee85ce-logs\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.694847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.764949 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bb49d4f49-jkfrz" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.777542 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8d585fd86-chhqf"] Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.779412 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.781770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.790870 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8d585fd86-chhqf"] Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.796583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbmdr\" (UniqueName: \"kubernetes.io/projected/2ab55a87-e31f-4e2b-91e4-804392c3f90a-kube-api-access-jbmdr\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.796634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591ffff8-803a-414b-b863-e3978dee85ce-logs\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.796662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.796767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-combined-ca-bundle\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.796848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab55a87-e31f-4e2b-91e4-804392c3f90a-logs\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.796879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn6wm\" (UniqueName: \"kubernetes.io/projected/591ffff8-803a-414b-b863-e3978dee85ce-kube-api-access-fn6wm\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.796928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtmcs\" (UniqueName: \"kubernetes.io/projected/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-kube-api-access-rtmcs\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.796994 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-config-data\") pod 
\"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-combined-ca-bundle\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797078 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591ffff8-803a-414b-b863-e3978dee85ce-logs\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-config-data-custom\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data-custom\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-config\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797707 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.797982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.798000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.799002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.799558 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-config\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.802582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-combined-ca-bundle\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.806917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-config-data-custom\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.807407 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.812268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591ffff8-803a-414b-b863-e3978dee85ce-config-data\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.823698 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn6wm\" (UniqueName: 
\"kubernetes.io/projected/591ffff8-803a-414b-b863-e3978dee85ce-kube-api-access-fn6wm\") pod \"barbican-keystone-listener-585f7f57cd-7qgls\" (UID: \"591ffff8-803a-414b-b863-e3978dee85ce\") " pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.831942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtmcs\" (UniqueName: \"kubernetes.io/projected/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-kube-api-access-rtmcs\") pod \"dnsmasq-dns-586bdc5f9-l7sjv\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: E0128 18:58:59.874902 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-dhqq7" podUID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.902947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.903018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbmdr\" (UniqueName: \"kubernetes.io/projected/2ab55a87-e31f-4e2b-91e4-804392c3f90a-kube-api-access-jbmdr\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.903066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-combined-ca-bundle\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.903108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab55a87-e31f-4e2b-91e4-804392c3f90a-logs\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.903339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data-custom\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.907054 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab55a87-e31f-4e2b-91e4-804392c3f90a-logs\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.915319 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data-custom\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.921041 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-combined-ca-bundle\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.921504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbmdr\" (UniqueName: \"kubernetes.io/projected/2ab55a87-e31f-4e2b-91e4-804392c3f90a-kube-api-access-jbmdr\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.951835 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:58:59 crc kubenswrapper[4749]: I0128 18:58:59.954834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data\") pod \"barbican-api-8d585fd86-chhqf\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:59:00 crc kubenswrapper[4749]: I0128 18:59:00.055452 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:59:00 crc kubenswrapper[4749]: I0128 18:59:00.059975 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" Jan 28 18:59:00 crc kubenswrapper[4749]: I0128 18:59:00.186812 4749 generic.go:334] "Generic (PLEG): container finished" podID="fb710c26-5706-4ed4-a08d-641315121c9e" containerID="33ecd0b56caf949988ede8b5f2774a38aaf9aba7ae08cc05cf7510833a868ba8" exitCode=0 Jan 28 18:59:00 crc kubenswrapper[4749]: I0128 18:59:00.186864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6mtp7" event={"ID":"fb710c26-5706-4ed4-a08d-641315121c9e","Type":"ContainerDied","Data":"33ecd0b56caf949988ede8b5f2774a38aaf9aba7ae08cc05cf7510833a868ba8"} Jan 28 18:59:00 crc kubenswrapper[4749]: I0128 18:59:00.393656 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bb49d4f49-jkfrz"] Jan 28 18:59:00 crc kubenswrapper[4749]: I0128 18:59:00.567536 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-l7sjv"] Jan 28 18:59:00 crc kubenswrapper[4749]: W0128 18:59:00.627811 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85a7b645_fc7f_4a3a_a7d3_39b0d7248e5c.slice/crio-6e3fc7b90f4ca072ce01a1845d87482ace1ee5d10e2a74794725c3fcd787c0ad WatchSource:0}: Error finding container 6e3fc7b90f4ca072ce01a1845d87482ace1ee5d10e2a74794725c3fcd787c0ad: Status 404 returned error can't find the container with id 6e3fc7b90f4ca072ce01a1845d87482ace1ee5d10e2a74794725c3fcd787c0ad Jan 28 18:59:00 crc kubenswrapper[4749]: I0128 18:59:00.828750 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8d585fd86-chhqf"] Jan 28 18:59:00 crc kubenswrapper[4749]: W0128 18:59:00.850421 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab55a87_e31f_4e2b_91e4_804392c3f90a.slice/crio-7b11ae04cc3c5322d2cc6e2a79d5e2f61e9c70d10b1334a72bb943a10bce064d WatchSource:0}: Error finding container 7b11ae04cc3c5322d2cc6e2a79d5e2f61e9c70d10b1334a72bb943a10bce064d: Status 404 returned error can't find the container with id 7b11ae04cc3c5322d2cc6e2a79d5e2f61e9c70d10b1334a72bb943a10bce064d Jan 28 18:59:00 crc kubenswrapper[4749]: I0128 18:59:00.915026 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-585f7f57cd-7qgls"] Jan 28 18:59:00 crc kubenswrapper[4749]: W0128 18:59:00.939803 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod591ffff8_803a_414b_b863_e3978dee85ce.slice/crio-0057aa30e4dc13d05d9cba951a3c70169d98ffa3dcfccd9fa4c665accb826c5b WatchSource:0}: Error finding container 0057aa30e4dc13d05d9cba951a3c70169d98ffa3dcfccd9fa4c665accb826c5b: Status 404 returned error can't find the container with id 0057aa30e4dc13d05d9cba951a3c70169d98ffa3dcfccd9fa4c665accb826c5b Jan 28 18:59:01 crc kubenswrapper[4749]: E0128 18:59:01.184991 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.212175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8d585fd86-chhqf" 
event={"ID":"2ab55a87-e31f-4e2b-91e4-804392c3f90a","Type":"ContainerStarted","Data":"df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.212220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8d585fd86-chhqf" event={"ID":"2ab55a87-e31f-4e2b-91e4-804392c3f90a","Type":"ContainerStarted","Data":"7b11ae04cc3c5322d2cc6e2a79d5e2f61e9c70d10b1334a72bb943a10bce064d"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.217714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" event={"ID":"591ffff8-803a-414b-b863-e3978dee85ce","Type":"ContainerStarted","Data":"0057aa30e4dc13d05d9cba951a3c70169d98ffa3dcfccd9fa4c665accb826c5b"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.220945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" event={"ID":"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c","Type":"ContainerStarted","Data":"dc373c830b501a6dce3e2a8d0cda0baded650fde68e4961128391a502b950f24"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.220996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" event={"ID":"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c","Type":"ContainerStarted","Data":"6e3fc7b90f4ca072ce01a1845d87482ace1ee5d10e2a74794725c3fcd787c0ad"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.225051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9177835f-67ce-45ec-9c8c-843e54b25deb","Type":"ContainerStarted","Data":"f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.225237 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="ceilometer-central-agent" containerID="cri-o://8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b" gracePeriod=30 Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.225390 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="ceilometer-notification-agent" containerID="cri-o://2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb" gracePeriod=30 Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.225404 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.225458 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="proxy-httpd" containerID="cri-o://f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25" gracePeriod=30 Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.227163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bb49d4f49-jkfrz" event={"ID":"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d","Type":"ContainerStarted","Data":"52a2e4b59da63302fd3dbca98866e4224cec639ed90b41169834f31ff21c0b33"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.233408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7699ff9496-twrqx" 
event={"ID":"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1","Type":"ContainerStarted","Data":"6644addaa0e652905a81673215292922f186e4b3df7f7c1c75d574f45b185e0f"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.233716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7699ff9496-twrqx" event={"ID":"190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1","Type":"ContainerStarted","Data":"b10e454a300541732de753362d0a7ea48e5be8286480415c800cf50e2a7b1f4b"} Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.233740 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:59:01 crc kubenswrapper[4749]: I0128 18:59:01.297261 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7699ff9496-twrqx" podStartSLOduration=7.29723877 podStartE2EDuration="7.29723877s" podCreationTimestamp="2026-01-28 18:58:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:01.285773897 +0000 UTC m=+1409.297300682" watchObservedRunningTime="2026-01-28 18:59:01.29723877 +0000 UTC m=+1409.308765545" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.245451 4749 generic.go:334] "Generic (PLEG): container finished" podID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerID="dc373c830b501a6dce3e2a8d0cda0baded650fde68e4961128391a502b950f24" exitCode=0 Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.245543 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" event={"ID":"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c","Type":"ContainerDied","Data":"dc373c830b501a6dce3e2a8d0cda0baded650fde68e4961128391a502b950f24"} Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.245917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" event={"ID":"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c","Type":"ContainerStarted","Data":"1fb8426f821aecf4deb16aef03b8357aaa391d6697d11c2f6bf72f61aafe5a84"} Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.245981 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.248748 4749 generic.go:334] "Generic (PLEG): container finished" podID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerID="8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b" exitCode=0 Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.248824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9177835f-67ce-45ec-9c8c-843e54b25deb","Type":"ContainerDied","Data":"8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b"} Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.253836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8d585fd86-chhqf" event={"ID":"2ab55a87-e31f-4e2b-91e4-804392c3f90a","Type":"ContainerStarted","Data":"269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71"} Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.253879 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.253902 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.253922 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.266799 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" podStartSLOduration=3.266783515 podStartE2EDuration="3.266783515s" podCreationTimestamp="2026-01-28 18:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:02.265241097 +0000 UTC m=+1410.276767872" watchObservedRunningTime="2026-01-28 18:59:02.266783515 +0000 UTC m=+1410.278310290" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.295744 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8d585fd86-chhqf" podStartSLOduration=3.295720982 podStartE2EDuration="3.295720982s" podCreationTimestamp="2026-01-28 18:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:02.288649597 +0000 UTC m=+1410.300176382" watchObservedRunningTime="2026-01-28 18:59:02.295720982 +0000 UTC m=+1410.307247797" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.808768 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-6mtp7" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.890167 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-combined-ca-bundle\") pod \"fb710c26-5706-4ed4-a08d-641315121c9e\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.890622 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-config-data\") pod \"fb710c26-5706-4ed4-a08d-641315121c9e\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.890856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h2dg\" (UniqueName: \"kubernetes.io/projected/fb710c26-5706-4ed4-a08d-641315121c9e-kube-api-access-8h2dg\") pod \"fb710c26-5706-4ed4-a08d-641315121c9e\" (UID: \"fb710c26-5706-4ed4-a08d-641315121c9e\") " Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.903756 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb710c26-5706-4ed4-a08d-641315121c9e-kube-api-access-8h2dg" (OuterVolumeSpecName: "kube-api-access-8h2dg") pod "fb710c26-5706-4ed4-a08d-641315121c9e" (UID: "fb710c26-5706-4ed4-a08d-641315121c9e"). InnerVolumeSpecName "kube-api-access-8h2dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:02 crc kubenswrapper[4749]: I0128 18:59:02.938490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb710c26-5706-4ed4-a08d-641315121c9e" (UID: "fb710c26-5706-4ed4-a08d-641315121c9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.014682 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.015076 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h2dg\" (UniqueName: \"kubernetes.io/projected/fb710c26-5706-4ed4-a08d-641315121c9e-kube-api-access-8h2dg\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.118439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-config-data" (OuterVolumeSpecName: "config-data") pod "fb710c26-5706-4ed4-a08d-641315121c9e" (UID: "fb710c26-5706-4ed4-a08d-641315121c9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.121535 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb710c26-5706-4ed4-a08d-641315121c9e-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.305246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6mtp7" event={"ID":"fb710c26-5706-4ed4-a08d-641315121c9e","Type":"ContainerDied","Data":"5ba6c01f6bb7e0cfb282fc2f8f7401a1daf61e2205526da3c670ff09159592f6"} Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.305289 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ba6c01f6bb7e0cfb282fc2f8f7401a1daf61e2205526da3c670ff09159592f6" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.305391 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-6mtp7" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.332121 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5789584db6-28j8v"] Jan 28 18:59:03 crc kubenswrapper[4749]: E0128 18:59:03.332607 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb710c26-5706-4ed4-a08d-641315121c9e" containerName="heat-db-sync" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.332622 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb710c26-5706-4ed4-a08d-641315121c9e" containerName="heat-db-sync" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.332878 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb710c26-5706-4ed4-a08d-641315121c9e" containerName="heat-db-sync" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.334313 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.342092 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.342714 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.355006 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5789584db6-28j8v"] Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.437116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-config-data\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.437164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-public-tls-certs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.437185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd564aa-37ee-4f52-b9c4-931550ef0aed-logs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.437221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-combined-ca-bundle\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.437348 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-config-data-custom\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.437411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-internal-tls-certs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.437544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5v8t\" (UniqueName: \"kubernetes.io/projected/2cd564aa-37ee-4f52-b9c4-931550ef0aed-kube-api-access-q5v8t\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.540175 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-config-data-custom\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.540266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-internal-tls-certs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.540409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5v8t\" (UniqueName: \"kubernetes.io/projected/2cd564aa-37ee-4f52-b9c4-931550ef0aed-kube-api-access-q5v8t\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.540540 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-config-data\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.540578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-public-tls-certs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.540603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd564aa-37ee-4f52-b9c4-931550ef0aed-logs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.540653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-combined-ca-bundle\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.546271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-config-data\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.546984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd564aa-37ee-4f52-b9c4-931550ef0aed-logs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.547115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-internal-tls-certs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.549867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-config-data-custom\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.551479 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-combined-ca-bundle\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.552861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd564aa-37ee-4f52-b9c4-931550ef0aed-public-tls-certs\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.570691 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5v8t\" (UniqueName: \"kubernetes.io/projected/2cd564aa-37ee-4f52-b9c4-931550ef0aed-kube-api-access-q5v8t\") pod \"barbican-api-5789584db6-28j8v\" (UID: \"2cd564aa-37ee-4f52-b9c4-931550ef0aed\") " pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:03 crc kubenswrapper[4749]: I0128 18:59:03.657743 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:04 crc kubenswrapper[4749]: I0128 18:59:04.581244 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5789584db6-28j8v"] Jan 28 18:59:04 crc kubenswrapper[4749]: W0128 18:59:04.583905 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cd564aa_37ee_4f52_b9c4_931550ef0aed.slice/crio-7b89c8be6ee330becaf9ea527ab866316f607c7a765d5d0fa340d4a4912da4cd WatchSource:0}: Error finding container 7b89c8be6ee330becaf9ea527ab866316f607c7a765d5d0fa340d4a4912da4cd: Status 404 returned error can't find the container with id 7b89c8be6ee330becaf9ea527ab866316f607c7a765d5d0fa340d4a4912da4cd Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.369732 4749 generic.go:334] "Generic (PLEG): container finished" podID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerID="2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb" exitCode=0 Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.369810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9177835f-67ce-45ec-9c8c-843e54b25deb","Type":"ContainerDied","Data":"2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb"} Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.372551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bb49d4f49-jkfrz" event={"ID":"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d","Type":"ContainerStarted","Data":"ed2001afa42b8eca815fa48d522c4b6defb16e1f55f481081465b1f69937ed88"} Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.372592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bb49d4f49-jkfrz" event={"ID":"3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d","Type":"ContainerStarted","Data":"a805a0109b2c6e34acec26fef823e14a6d6a8fccd63fe684511146261289bf47"} Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.375002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5789584db6-28j8v" event={"ID":"2cd564aa-37ee-4f52-b9c4-931550ef0aed","Type":"ContainerStarted","Data":"db61f028a0c2922d82aab12aaa2a61b2c9d42e940e73c549767e751cb3819782"} Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.375048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5789584db6-28j8v" event={"ID":"2cd564aa-37ee-4f52-b9c4-931550ef0aed","Type":"ContainerStarted","Data":"8a08d5ef5cac86ae4382a7d2539f9465d7f0616f0580a71a9e17a6104cb30c05"} Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.375061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5789584db6-28j8v" event={"ID":"2cd564aa-37ee-4f52-b9c4-931550ef0aed","Type":"ContainerStarted","Data":"7b89c8be6ee330becaf9ea527ab866316f607c7a765d5d0fa340d4a4912da4cd"} Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.375078 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.375123 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.377846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" 
event={"ID":"591ffff8-803a-414b-b863-e3978dee85ce","Type":"ContainerStarted","Data":"236f589b12bc350ffd03ba68fb6d98c9be97499588ef499d91bb92563eea99c7"} Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.378036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" event={"ID":"591ffff8-803a-414b-b863-e3978dee85ce","Type":"ContainerStarted","Data":"930057ee192ca1c68f505b5590d6a9beeffac50e60f1f7e8a4a0041b951fc6a0"} Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.398938 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7bb49d4f49-jkfrz" podStartSLOduration=2.728187246 podStartE2EDuration="6.398918444s" podCreationTimestamp="2026-01-28 18:58:59 +0000 UTC" firstStartedPulling="2026-01-28 18:59:00.412209136 +0000 UTC m=+1408.423735911" lastFinishedPulling="2026-01-28 18:59:04.082940334 +0000 UTC m=+1412.094467109" observedRunningTime="2026-01-28 18:59:05.391089471 +0000 UTC m=+1413.402616246" watchObservedRunningTime="2026-01-28 18:59:05.398918444 +0000 UTC m=+1413.410445219" Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.424246 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-585f7f57cd-7qgls" podStartSLOduration=3.285638713 podStartE2EDuration="6.424223231s" podCreationTimestamp="2026-01-28 18:58:59 +0000 UTC" firstStartedPulling="2026-01-28 18:59:00.944265724 +0000 UTC m=+1408.955792499" lastFinishedPulling="2026-01-28 18:59:04.082850252 +0000 UTC m=+1412.094377017" observedRunningTime="2026-01-28 18:59:05.420494359 +0000 UTC m=+1413.432021154" watchObservedRunningTime="2026-01-28 18:59:05.424223231 +0000 UTC m=+1413.435750006" Jan 28 18:59:05 crc kubenswrapper[4749]: I0128 18:59:05.463141 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5789584db6-28j8v" podStartSLOduration=2.463101653 podStartE2EDuration="2.463101653s" podCreationTimestamp="2026-01-28 18:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:05.450367828 +0000 UTC m=+1413.461894653" watchObservedRunningTime="2026-01-28 18:59:05.463101653 +0000 UTC m=+1413.474628428" Jan 28 18:59:06 crc kubenswrapper[4749]: I0128 18:59:06.925640 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 18:59:06 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:59:06 crc kubenswrapper[4749]: > Jan 28 18:59:09 crc kubenswrapper[4749]: I0128 18:59:09.953561 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.043625 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fnmw9"] Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.043904 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" podUID="275bf36a-db3c-45a5-8eee-fbb493146648" containerName="dnsmasq-dns" containerID="cri-o://a6b191bb88c854644fc560781d94799bb7158fb4409855c799183972da70a81e" gracePeriod=10 Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.275147 4749 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" podUID="275bf36a-db3c-45a5-8eee-fbb493146648" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.184:5353: connect: connection refused" Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.430124 4749 generic.go:334] "Generic (PLEG): container finished" podID="275bf36a-db3c-45a5-8eee-fbb493146648" containerID="a6b191bb88c854644fc560781d94799bb7158fb4409855c799183972da70a81e" exitCode=0 Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.430186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" event={"ID":"275bf36a-db3c-45a5-8eee-fbb493146648","Type":"ContainerDied","Data":"a6b191bb88c854644fc560781d94799bb7158fb4409855c799183972da70a81e"} Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.807971 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.866873 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-config\") pod \"275bf36a-db3c-45a5-8eee-fbb493146648\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.942399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-config" (OuterVolumeSpecName: "config") pod "275bf36a-db3c-45a5-8eee-fbb493146648" (UID: "275bf36a-db3c-45a5-8eee-fbb493146648"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.969027 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-swift-storage-0\") pod \"275bf36a-db3c-45a5-8eee-fbb493146648\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.969111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-sb\") pod \"275bf36a-db3c-45a5-8eee-fbb493146648\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.969157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-svc\") pod \"275bf36a-db3c-45a5-8eee-fbb493146648\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.969260 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-nb\") pod \"275bf36a-db3c-45a5-8eee-fbb493146648\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.969290 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9frd\" (UniqueName: \"kubernetes.io/projected/275bf36a-db3c-45a5-8eee-fbb493146648-kube-api-access-z9frd\") pod \"275bf36a-db3c-45a5-8eee-fbb493146648\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " Jan 28 18:59:10 crc 
kubenswrapper[4749]: I0128 18:59:10.969774 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:10 crc kubenswrapper[4749]: I0128 18:59:10.977752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/275bf36a-db3c-45a5-8eee-fbb493146648-kube-api-access-z9frd" (OuterVolumeSpecName: "kube-api-access-z9frd") pod "275bf36a-db3c-45a5-8eee-fbb493146648" (UID: "275bf36a-db3c-45a5-8eee-fbb493146648"). InnerVolumeSpecName "kube-api-access-z9frd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.055001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "275bf36a-db3c-45a5-8eee-fbb493146648" (UID: "275bf36a-db3c-45a5-8eee-fbb493146648"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.063761 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "275bf36a-db3c-45a5-8eee-fbb493146648" (UID: "275bf36a-db3c-45a5-8eee-fbb493146648"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.070506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "275bf36a-db3c-45a5-8eee-fbb493146648" (UID: "275bf36a-db3c-45a5-8eee-fbb493146648"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.072403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-swift-storage-0\") pod \"275bf36a-db3c-45a5-8eee-fbb493146648\" (UID: \"275bf36a-db3c-45a5-8eee-fbb493146648\") " Jan 28 18:59:11 crc kubenswrapper[4749]: W0128 18:59:11.072889 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/275bf36a-db3c-45a5-8eee-fbb493146648/volumes/kubernetes.io~configmap/dns-swift-storage-0 Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.072917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "275bf36a-db3c-45a5-8eee-fbb493146648" (UID: "275bf36a-db3c-45a5-8eee-fbb493146648"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.073410 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.073426 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.073434 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.073442 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9frd\" (UniqueName: \"kubernetes.io/projected/275bf36a-db3c-45a5-8eee-fbb493146648-kube-api-access-z9frd\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.074987 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "275bf36a-db3c-45a5-8eee-fbb493146648" (UID: "275bf36a-db3c-45a5-8eee-fbb493146648"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.175205 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275bf36a-db3c-45a5-8eee-fbb493146648-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.442287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" event={"ID":"275bf36a-db3c-45a5-8eee-fbb493146648","Type":"ContainerDied","Data":"f63dc23351315daacdfef92399d4b1700869ed9a64f1bcc1b8a66d78e46afd7f"} Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.442732 4749 scope.go:117] "RemoveContainer" containerID="a6b191bb88c854644fc560781d94799bb7158fb4409855c799183972da70a81e" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.442396 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fnmw9" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.475016 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fnmw9"] Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.480422 4749 scope.go:117] "RemoveContainer" containerID="ddd6f4363a05b4af47b00166b29a885fb4ee2950240a9778a2947eb06a55c917" Jan 28 18:59:11 crc kubenswrapper[4749]: I0128 18:59:11.484540 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fnmw9"] Jan 28 18:59:12 crc kubenswrapper[4749]: I0128 18:59:12.021559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:59:12 crc kubenswrapper[4749]: I0128 18:59:12.598632 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:59:12 crc kubenswrapper[4749]: I0128 18:59:12.884734 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="275bf36a-db3c-45a5-8eee-fbb493146648" path="/var/lib/kubelet/pods/275bf36a-db3c-45a5-8eee-fbb493146648/volumes" Jan 28 18:59:15 crc kubenswrapper[4749]: I0128 18:59:15.413144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:15 crc kubenswrapper[4749]: I0128 18:59:15.636071 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5789584db6-28j8v" Jan 28 18:59:15 crc kubenswrapper[4749]: I0128 18:59:15.735820 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8d585fd86-chhqf"] Jan 28 18:59:15 crc kubenswrapper[4749]: I0128 18:59:15.736511 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8d585fd86-chhqf" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerName="barbican-api-log" containerID="cri-o://df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368" gracePeriod=30 Jan 28 18:59:15 crc kubenswrapper[4749]: I0128 18:59:15.738033 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8d585fd86-chhqf" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerName="barbican-api" containerID="cri-o://269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71" gracePeriod=30 Jan 28 18:59:16 crc kubenswrapper[4749]: I0128 18:59:16.493187 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerID="df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368" exitCode=143 Jan 28 18:59:16 crc kubenswrapper[4749]: I0128 18:59:16.493493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8d585fd86-chhqf" event={"ID":"2ab55a87-e31f-4e2b-91e4-804392c3f90a","Type":"ContainerDied","Data":"df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368"} Jan 28 18:59:16 crc kubenswrapper[4749]: I0128 18:59:16.912530 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 18:59:16 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:59:16 crc kubenswrapper[4749]: > Jan 28 18:59:17 crc kubenswrapper[4749]: I0128 18:59:17.515182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-dhqq7" event={"ID":"e9b37dae-71dc-4af8-9229-4f7124bcbb16","Type":"ContainerStarted","Data":"2923f29d0cd47940ac2825943fe1a3e3b9b5e3d05ed6e9bf51ed9dda4463b1e7"} Jan 28 18:59:17 crc kubenswrapper[4749]: I0128 18:59:17.545131 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dhqq7" podStartSLOduration=4.5739058660000005 podStartE2EDuration="1m43.545110637s" podCreationTimestamp="2026-01-28 18:57:34 +0000 UTC" firstStartedPulling="2026-01-28 18:57:36.645556431 +0000 UTC m=+1324.657083206" lastFinishedPulling="2026-01-28 18:59:15.616761202 +0000 UTC m=+1423.628287977" observedRunningTime="2026-01-28 18:59:17.542464222 +0000 UTC m=+1425.553991017" watchObservedRunningTime="2026-01-28 18:59:17.545110637 +0000 UTC m=+1425.556637412" Jan 28 18:59:18 crc kubenswrapper[4749]: I0128 18:59:18.529495 4749 generic.go:334] "Generic (PLEG): container finished" podID="58bcb1ff-0480-4503-ac70-ce54b6ab6a2d" containerID="5cd99b761b7c639919edb05e077be3b019720251d15ea49760004e79f83885e2" exitCode=0 Jan 28 18:59:18 crc kubenswrapper[4749]: I0128 18:59:18.529593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-69wsn" event={"ID":"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d","Type":"ContainerDied","Data":"5cd99b761b7c639919edb05e077be3b019720251d15ea49760004e79f83885e2"} Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.474737 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.541922 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerID="269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71" exitCode=0 Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.542132 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8d585fd86-chhqf" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.542478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8d585fd86-chhqf" event={"ID":"2ab55a87-e31f-4e2b-91e4-804392c3f90a","Type":"ContainerDied","Data":"269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71"} Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.542525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8d585fd86-chhqf" event={"ID":"2ab55a87-e31f-4e2b-91e4-804392c3f90a","Type":"ContainerDied","Data":"7b11ae04cc3c5322d2cc6e2a79d5e2f61e9c70d10b1334a72bb943a10bce064d"} Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.542547 4749 scope.go:117] "RemoveContainer" containerID="269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.575087 4749 scope.go:117] "RemoveContainer" containerID="df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.592472 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab55a87-e31f-4e2b-91e4-804392c3f90a-logs\") pod \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.592665 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data-custom\") pod \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.592791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbmdr\" (UniqueName: \"kubernetes.io/projected/2ab55a87-e31f-4e2b-91e4-804392c3f90a-kube-api-access-jbmdr\") pod \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.592827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data\") pod \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.592884 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-combined-ca-bundle\") pod \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\" (UID: \"2ab55a87-e31f-4e2b-91e4-804392c3f90a\") " Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.594828 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab55a87-e31f-4e2b-91e4-804392c3f90a-logs" (OuterVolumeSpecName: "logs") pod "2ab55a87-e31f-4e2b-91e4-804392c3f90a" (UID: "2ab55a87-e31f-4e2b-91e4-804392c3f90a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.599157 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab55a87-e31f-4e2b-91e4-804392c3f90a-kube-api-access-jbmdr" (OuterVolumeSpecName: "kube-api-access-jbmdr") pod "2ab55a87-e31f-4e2b-91e4-804392c3f90a" (UID: "2ab55a87-e31f-4e2b-91e4-804392c3f90a"). InnerVolumeSpecName "kube-api-access-jbmdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.599392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2ab55a87-e31f-4e2b-91e4-804392c3f90a" (UID: "2ab55a87-e31f-4e2b-91e4-804392c3f90a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.601397 4749 scope.go:117] "RemoveContainer" containerID="269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71" Jan 28 18:59:19 crc kubenswrapper[4749]: E0128 18:59:19.602642 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71\": container with ID starting with 269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71 not found: ID does not exist" containerID="269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.602792 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71"} err="failed to get container status \"269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71\": rpc error: code = NotFound desc = could not find container \"269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71\": container with ID starting with 269c24651ad1b6ad2afeeddecf5abb6d614e95dc74fec614df05ff405a721f71 not found: ID does not exist" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.602818 4749 scope.go:117] "RemoveContainer" containerID="df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368" Jan 28 18:59:19 crc kubenswrapper[4749]: E0128 18:59:19.603523 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368\": container with ID starting with df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368 not found: ID does not exist" containerID="df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.603556 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368"} err="failed to get container status \"df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368\": rpc error: code = NotFound desc = could not find container \"df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368\": container with ID starting with df65909c7b66e65934ec84c8a6bc2c2e90d94bfdd7d9415ebe82f1cfe381a368 not found: ID does not exist" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.626808 4749 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ab55a87-e31f-4e2b-91e4-804392c3f90a" (UID: "2ab55a87-e31f-4e2b-91e4-804392c3f90a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.653091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data" (OuterVolumeSpecName: "config-data") pod "2ab55a87-e31f-4e2b-91e4-804392c3f90a" (UID: "2ab55a87-e31f-4e2b-91e4-804392c3f90a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.696758 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab55a87-e31f-4e2b-91e4-804392c3f90a-logs\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.696797 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.696812 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbmdr\" (UniqueName: \"kubernetes.io/projected/2ab55a87-e31f-4e2b-91e4-804392c3f90a-kube-api-access-jbmdr\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.696823 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.696834 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab55a87-e31f-4e2b-91e4-804392c3f90a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.887040 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8d585fd86-chhqf"] Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.903188 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8d585fd86-chhqf"] Jan 28 18:59:19 crc kubenswrapper[4749]: I0128 18:59:19.978003 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-69wsn" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.105879 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-combined-ca-bundle\") pod \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.105957 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzrgk\" (UniqueName: \"kubernetes.io/projected/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-kube-api-access-zzrgk\") pod \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.106013 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-config\") pod \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\" (UID: \"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d\") " Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.112572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-kube-api-access-zzrgk" (OuterVolumeSpecName: "kube-api-access-zzrgk") pod "58bcb1ff-0480-4503-ac70-ce54b6ab6a2d" (UID: "58bcb1ff-0480-4503-ac70-ce54b6ab6a2d"). InnerVolumeSpecName "kube-api-access-zzrgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.136036 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-config" (OuterVolumeSpecName: "config") pod "58bcb1ff-0480-4503-ac70-ce54b6ab6a2d" (UID: "58bcb1ff-0480-4503-ac70-ce54b6ab6a2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.139432 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58bcb1ff-0480-4503-ac70-ce54b6ab6a2d" (UID: "58bcb1ff-0480-4503-ac70-ce54b6ab6a2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.208903 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.208939 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzrgk\" (UniqueName: \"kubernetes.io/projected/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-kube-api-access-zzrgk\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.208952 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.556009 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-69wsn" event={"ID":"58bcb1ff-0480-4503-ac70-ce54b6ab6a2d","Type":"ContainerDied","Data":"757a995c78231792b42e8d071b94e5e07e7807bc6629125adf5d8f198ffbe511"} Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.556053 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="757a995c78231792b42e8d071b94e5e07e7807bc6629125adf5d8f198ffbe511" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.556120 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-69wsn" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779095 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8pvvw"] Jan 28 18:59:20 crc kubenswrapper[4749]: E0128 18:59:20.779542 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275bf36a-db3c-45a5-8eee-fbb493146648" containerName="dnsmasq-dns" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779560 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="275bf36a-db3c-45a5-8eee-fbb493146648" containerName="dnsmasq-dns" Jan 28 18:59:20 crc kubenswrapper[4749]: E0128 18:59:20.779575 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58bcb1ff-0480-4503-ac70-ce54b6ab6a2d" containerName="neutron-db-sync" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779584 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bcb1ff-0480-4503-ac70-ce54b6ab6a2d" containerName="neutron-db-sync" Jan 28 18:59:20 crc kubenswrapper[4749]: E0128 18:59:20.779602 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerName="barbican-api-log" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779608 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerName="barbican-api-log" Jan 28 18:59:20 crc kubenswrapper[4749]: E0128 18:59:20.779618 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275bf36a-db3c-45a5-8eee-fbb493146648" containerName="init" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779624 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="275bf36a-db3c-45a5-8eee-fbb493146648" containerName="init" Jan 28 18:59:20 crc kubenswrapper[4749]: E0128 18:59:20.779632 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerName="barbican-api" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779638 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerName="barbican-api" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779833 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="275bf36a-db3c-45a5-8eee-fbb493146648" containerName="dnsmasq-dns" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779853 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerName="barbican-api-log" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779863 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" containerName="barbican-api" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.779874 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="58bcb1ff-0480-4503-ac70-ce54b6ab6a2d" containerName="neutron-db-sync" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.780961 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.825063 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8pvvw"] Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.828165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.828284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.828420 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.828468 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-config\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.828507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdnrl\" (UniqueName: \"kubernetes.io/projected/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-kube-api-access-kdnrl\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.828532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.884764 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab55a87-e31f-4e2b-91e4-804392c3f90a" path="/var/lib/kubelet/pods/2ab55a87-e31f-4e2b-91e4-804392c3f90a/volumes" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.931239 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.931348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.931399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.931513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.931567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-config\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.931596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdnrl\" (UniqueName: \"kubernetes.io/projected/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-kube-api-access-kdnrl\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.933510 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67c94bd564-mjrws"] Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.935794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.936422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.936946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-config\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.939122 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.939150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.944509 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.947491 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.947740 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sr5z4" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.948069 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.954356 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.958629 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c94bd564-mjrws"] Jan 28 18:59:20 crc kubenswrapper[4749]: I0128 18:59:20.958972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdnrl\" (UniqueName: \"kubernetes.io/projected/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-kube-api-access-kdnrl\") pod \"dnsmasq-dns-85ff748b95-8pvvw\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.118595 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.138806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-combined-ca-bundle\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.139115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6jk\" (UniqueName: \"kubernetes.io/projected/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-kube-api-access-qz6jk\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.139144 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-config\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.139502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-httpd-config\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.139673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-ovndb-tls-certs\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.250036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-httpd-config\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.250159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-ovndb-tls-certs\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.250491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-combined-ca-bundle\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.250572 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6jk\" (UniqueName: \"kubernetes.io/projected/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-kube-api-access-qz6jk\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") 
" pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.250616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-config\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.273015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-ovndb-tls-certs\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.274898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-combined-ca-bundle\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.286802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6jk\" (UniqueName: \"kubernetes.io/projected/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-kube-api-access-qz6jk\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.302634 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-httpd-config\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.307806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-config\") pod \"neutron-67c94bd564-mjrws\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.343253 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:21 crc kubenswrapper[4749]: I0128 18:59:21.765554 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8pvvw"] Jan 28 18:59:22 crc kubenswrapper[4749]: I0128 18:59:22.275863 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-54bd464d95-gdqhz" Jan 28 18:59:22 crc kubenswrapper[4749]: I0128 18:59:22.280768 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c94bd564-mjrws"] Jan 28 18:59:22 crc kubenswrapper[4749]: I0128 18:59:22.632084 4749 generic.go:334] "Generic (PLEG): container finished" podID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerID="95b9ddf3e9000c8d606e68cde1c2e7a7bd3744c7be4d85bdd29628353f94d749" exitCode=0 Jan 28 18:59:22 crc kubenswrapper[4749]: I0128 18:59:22.632285 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" event={"ID":"67f487ff-4f6f-46e6-8271-2bd6666ab5c3","Type":"ContainerDied","Data":"95b9ddf3e9000c8d606e68cde1c2e7a7bd3744c7be4d85bdd29628353f94d749"} Jan 28 18:59:22 crc kubenswrapper[4749]: I0128 18:59:22.632401 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" event={"ID":"67f487ff-4f6f-46e6-8271-2bd6666ab5c3","Type":"ContainerStarted","Data":"582c6c4487f97479b18d44b45a32314ec930f85973ed9d452d3324b661f6e098"} Jan 28 18:59:22 crc kubenswrapper[4749]: I0128 18:59:22.634477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c94bd564-mjrws" event={"ID":"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc","Type":"ContainerStarted","Data":"f9aab673c56d11f6f91bb1de34d8eb0651d6371c92972576605608872ef30eb2"} Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.362591 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68fb6b79f7-lxql7"] Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.367478 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.370058 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.372545 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.376697 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68fb6b79f7-lxql7"] Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.421429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bctdq\" (UniqueName: \"kubernetes.io/projected/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-kube-api-access-bctdq\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.421644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-combined-ca-bundle\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.421691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-httpd-config\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.421750 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-config\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.421822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-ovndb-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.421845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-public-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.421870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-internal-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.524418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bctdq\" (UniqueName: 
\"kubernetes.io/projected/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-kube-api-access-bctdq\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.524661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-combined-ca-bundle\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.524709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-httpd-config\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.524776 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-config\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.524856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-ovndb-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.524888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-public-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.524927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-internal-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.533528 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-httpd-config\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.534300 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-internal-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.535049 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-ovndb-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") 
" pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.535231 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-config\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.537678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-public-tls-certs\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.539065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-combined-ca-bundle\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.547619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bctdq\" (UniqueName: \"kubernetes.io/projected/5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7-kube-api-access-bctdq\") pod \"neutron-68fb6b79f7-lxql7\" (UID: \"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7\") " pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.675977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c94bd564-mjrws" event={"ID":"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc","Type":"ContainerStarted","Data":"076ee848881ff4c5f111de5abc89f0713e2834b80064fbd9e09e8fade6b522bc"} Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.676027 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c94bd564-mjrws" event={"ID":"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc","Type":"ContainerStarted","Data":"917d85169a5b197c14265a97b8d1a2cbf02b19faa3f23321a8339221122881d0"} Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.676476 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67c94bd564-mjrws" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.682597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" event={"ID":"67f487ff-4f6f-46e6-8271-2bd6666ab5c3","Type":"ContainerStarted","Data":"2ba8c605f84ee7fd33fe562a8d71da7392d435b5ebf8798fa6b3d2864adc9fb7"} Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.683056 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.698174 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.704121 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67c94bd564-mjrws" podStartSLOduration=3.704102963 podStartE2EDuration="3.704102963s" podCreationTimestamp="2026-01-28 18:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:23.692663938 +0000 UTC m=+1431.704190723" watchObservedRunningTime="2026-01-28 18:59:23.704102963 +0000 UTC m=+1431.715629738" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.749001 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" podStartSLOduration=3.748975429 podStartE2EDuration="3.748975429s" podCreationTimestamp="2026-01-28 18:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:23.717529007 +0000 UTC m=+1431.729055792" watchObservedRunningTime="2026-01-28 18:59:23.748975429 +0000 UTC m=+1431.760502204" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.843665 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.858469 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.864659 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-z7sfk" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.865067 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.865351 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.901737 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.952959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b70372a-0994-4bb9-8369-7b00699ee7c0-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.953009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b70372a-0994-4bb9-8369-7b00699ee7c0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.953169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklhd\" (UniqueName: \"kubernetes.io/projected/1b70372a-0994-4bb9-8369-7b00699ee7c0-kube-api-access-qklhd\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:23 crc kubenswrapper[4749]: I0128 18:59:23.953232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b70372a-0994-4bb9-8369-7b00699ee7c0-openstack-config\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.055780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qklhd\" (UniqueName: \"kubernetes.io/projected/1b70372a-0994-4bb9-8369-7b00699ee7c0-kube-api-access-qklhd\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.055846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b70372a-0994-4bb9-8369-7b00699ee7c0-openstack-config\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.055938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b70372a-0994-4bb9-8369-7b00699ee7c0-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.055965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b70372a-0994-4bb9-8369-7b00699ee7c0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.057182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b70372a-0994-4bb9-8369-7b00699ee7c0-openstack-config\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.065462 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b70372a-0994-4bb9-8369-7b00699ee7c0-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.072964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b70372a-0994-4bb9-8369-7b00699ee7c0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.079123 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qklhd\" (UniqueName: \"kubernetes.io/projected/1b70372a-0994-4bb9-8369-7b00699ee7c0-kube-api-access-qklhd\") pod \"openstackclient\" (UID: \"1b70372a-0994-4bb9-8369-7b00699ee7c0\") " pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.193989 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.484751 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68fb6b79f7-lxql7"] Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.746555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68fb6b79f7-lxql7" event={"ID":"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7","Type":"ContainerStarted","Data":"07b4b630fae6188e7d6585b632bfc9ed0626e9b5b02477f4b4c008153b0de28f"} Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.758812 4749 generic.go:334] "Generic (PLEG): container finished" podID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" containerID="2923f29d0cd47940ac2825943fe1a3e3b9b5e3d05ed6e9bf51ed9dda4463b1e7" exitCode=0 Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.759828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dhqq7" event={"ID":"e9b37dae-71dc-4af8-9229-4f7124bcbb16","Type":"ContainerDied","Data":"2923f29d0cd47940ac2825943fe1a3e3b9b5e3d05ed6e9bf51ed9dda4463b1e7"} Jan 28 18:59:24 crc kubenswrapper[4749]: I0128 18:59:24.844554 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 28 18:59:25 crc kubenswrapper[4749]: I0128 18:59:25.780892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68fb6b79f7-lxql7" event={"ID":"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7","Type":"ContainerStarted","Data":"79bcca49fa53ac1956318705e1109005e462a45f814352cb8362cb163082b445"} Jan 28 18:59:25 crc kubenswrapper[4749]: I0128 18:59:25.781243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68fb6b79f7-lxql7" event={"ID":"5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7","Type":"ContainerStarted","Data":"e17367c0d35cf560c46c3361c0463c4f506f7bea7f5fab05f2ebc2bbd9a976e2"} Jan 28 18:59:25 crc kubenswrapper[4749]: I0128 18:59:25.781262 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:25 crc kubenswrapper[4749]: I0128 18:59:25.785998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1b70372a-0994-4bb9-8369-7b00699ee7c0","Type":"ContainerStarted","Data":"4e56cb7fa921bfe3bf3bbdd0a4de5dfa250520a300f65e3be1d82ede821d8faa"} Jan 28 18:59:25 crc kubenswrapper[4749]: I0128 18:59:25.805964 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68fb6b79f7-lxql7" podStartSLOduration=2.803802541 podStartE2EDuration="2.803802541s" podCreationTimestamp="2026-01-28 18:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:25.797230747 +0000 UTC m=+1433.808757522" watchObservedRunningTime="2026-01-28 18:59:25.803802541 +0000 UTC m=+1433.815329316" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.440099 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.524417 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-combined-ca-bundle\") pod \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.524462 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9b37dae-71dc-4af8-9229-4f7124bcbb16-etc-machine-id\") pod \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.524519 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-config-data\") pod \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.524562 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcfp9\" (UniqueName: \"kubernetes.io/projected/e9b37dae-71dc-4af8-9229-4f7124bcbb16-kube-api-access-dcfp9\") pod \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.524643 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-scripts\") pod \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.524843 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-db-sync-config-data\") pod \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\" (UID: \"e9b37dae-71dc-4af8-9229-4f7124bcbb16\") " Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.527528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9b37dae-71dc-4af8-9229-4f7124bcbb16-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e9b37dae-71dc-4af8-9229-4f7124bcbb16" (UID: "e9b37dae-71dc-4af8-9229-4f7124bcbb16"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.536032 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b37dae-71dc-4af8-9229-4f7124bcbb16-kube-api-access-dcfp9" (OuterVolumeSpecName: "kube-api-access-dcfp9") pod "e9b37dae-71dc-4af8-9229-4f7124bcbb16" (UID: "e9b37dae-71dc-4af8-9229-4f7124bcbb16"). InnerVolumeSpecName "kube-api-access-dcfp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.540125 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e9b37dae-71dc-4af8-9229-4f7124bcbb16" (UID: "e9b37dae-71dc-4af8-9229-4f7124bcbb16"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.562119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-scripts" (OuterVolumeSpecName: "scripts") pod "e9b37dae-71dc-4af8-9229-4f7124bcbb16" (UID: "e9b37dae-71dc-4af8-9229-4f7124bcbb16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.584510 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9b37dae-71dc-4af8-9229-4f7124bcbb16" (UID: "e9b37dae-71dc-4af8-9229-4f7124bcbb16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.628404 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.628482 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.628496 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.628506 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9b37dae-71dc-4af8-9229-4f7124bcbb16-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.628514 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcfp9\" (UniqueName: \"kubernetes.io/projected/e9b37dae-71dc-4af8-9229-4f7124bcbb16-kube-api-access-dcfp9\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.636592 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-config-data" (OuterVolumeSpecName: "config-data") pod "e9b37dae-71dc-4af8-9229-4f7124bcbb16" (UID: "e9b37dae-71dc-4af8-9229-4f7124bcbb16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.730565 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b37dae-71dc-4af8-9229-4f7124bcbb16-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.800393 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dhqq7" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.800414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dhqq7" event={"ID":"e9b37dae-71dc-4af8-9229-4f7124bcbb16","Type":"ContainerDied","Data":"d82082391df70777c1598902538b39b6b53cfb42d711e00645e992e40236f66c"} Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.800506 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82082391df70777c1598902538b39b6b53cfb42d711e00645e992e40236f66c" Jan 28 18:59:26 crc kubenswrapper[4749]: I0128 18:59:26.902440 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 18:59:26 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:59:26 crc kubenswrapper[4749]: > Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.063582 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:27 crc kubenswrapper[4749]: E0128 18:59:27.064094 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" containerName="cinder-db-sync" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.064116 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" containerName="cinder-db-sync" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.065446 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" containerName="cinder-db-sync" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.066674 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.079116 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.079306 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.079508 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.079610 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nhvmb" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.088691 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.142508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.143162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61f9e37b-7e92-454e-8806-ab9bda2efbf7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.143313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.143706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f842l\" (UniqueName: \"kubernetes.io/projected/61f9e37b-7e92-454e-8806-ab9bda2efbf7-kube-api-access-f842l\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.143870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.143962 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.237343 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.270408 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.270488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61f9e37b-7e92-454e-8806-ab9bda2efbf7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.270549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.270723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f842l\" (UniqueName: \"kubernetes.io/projected/61f9e37b-7e92-454e-8806-ab9bda2efbf7-kube-api-access-f842l\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.270825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.270865 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.280433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61f9e37b-7e92-454e-8806-ab9bda2efbf7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.296052 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.306263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.306857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 
18:59:27.321855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.322893 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f842l\" (UniqueName: \"kubernetes.io/projected/61f9e37b-7e92-454e-8806-ab9bda2efbf7-kube-api-access-f842l\") pod \"cinder-scheduler-0\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.332831 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8pvvw"] Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.333701 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" podUID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerName="dnsmasq-dns" containerID="cri-o://2ba8c605f84ee7fd33fe562a8d71da7392d435b5ebf8798fa6b3d2864adc9fb7" gracePeriod=10 Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.374635 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" podUID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.199:5353: connect: connection reset by peer" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.397082 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jntgh"] Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.398919 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.404047 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.406704 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jntgh"] Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.458176 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.484452 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.494809 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.502009 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.503718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.503802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.503866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7kh\" (UniqueName: \"kubernetes.io/projected/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-kube-api-access-9c7kh\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.503911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.503948 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-config\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.503978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233307a3-83e8-4fa9-9ba2-e2863125d6d3-logs\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 
18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-config\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607741 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/233307a3-83e8-4fa9-9ba2-e2863125d6d3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607839 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-scripts\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.607973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data-custom\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.608014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.608054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.608129 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9c7kh\" (UniqueName: \"kubernetes.io/projected/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-kube-api-access-9c7kh\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.608148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj8lr\" (UniqueName: \"kubernetes.io/projected/233307a3-83e8-4fa9-9ba2-e2863125d6d3-kube-api-access-lj8lr\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.609273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.610132 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-config\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.610232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.610417 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.610658 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.645908 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7699ff9496-twrqx" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.724168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.724664 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj8lr\" (UniqueName: \"kubernetes.io/projected/233307a3-83e8-4fa9-9ba2-e2863125d6d3-kube-api-access-lj8lr\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.724730 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233307a3-83e8-4fa9-9ba2-e2863125d6d3-logs\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.724907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/233307a3-83e8-4fa9-9ba2-e2863125d6d3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.724951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.724977 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-scripts\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.725093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data-custom\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.726549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/233307a3-83e8-4fa9-9ba2-e2863125d6d3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.728146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233307a3-83e8-4fa9-9ba2-e2863125d6d3-logs\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.729065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7kh\" (UniqueName: \"kubernetes.io/projected/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-kube-api-access-9c7kh\") pod \"dnsmasq-dns-5c9776ccc5-jntgh\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.737502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.738357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data-custom\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.740263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.752235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-scripts\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.766518 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj8lr\" (UniqueName: \"kubernetes.io/projected/233307a3-83e8-4fa9-9ba2-e2863125d6d3-kube-api-access-lj8lr\") pod \"cinder-api-0\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " pod="openstack/cinder-api-0" Jan 28 18:59:27 crc kubenswrapper[4749]: E0128 18:59:27.805693 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67f487ff_4f6f_46e6_8271_2bd6666ab5c3.slice/crio-2ba8c605f84ee7fd33fe562a8d71da7392d435b5ebf8798fa6b3d2864adc9fb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67f487ff_4f6f_46e6_8271_2bd6666ab5c3.slice/crio-conmon-2ba8c605f84ee7fd33fe562a8d71da7392d435b5ebf8798fa6b3d2864adc9fb7.scope\": RecentStats: unable to find data in memory cache]" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.838565 4749 generic.go:334] "Generic (PLEG): container finished" podID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerID="2ba8c605f84ee7fd33fe562a8d71da7392d435b5ebf8798fa6b3d2864adc9fb7" exitCode=0 Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.848358 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" event={"ID":"67f487ff-4f6f-46e6-8271-2bd6666ab5c3","Type":"ContainerDied","Data":"2ba8c605f84ee7fd33fe562a8d71da7392d435b5ebf8798fa6b3d2864adc9fb7"} Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.923282 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:27 crc kubenswrapper[4749]: I0128 18:59:27.942740 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.448564 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.582052 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-swift-storage-0\") pod \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.582457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-nb\") pod \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.582549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-sb\") pod \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.582579 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-config\") pod \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.582788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-svc\") pod \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.582860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdnrl\" (UniqueName: \"kubernetes.io/projected/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-kube-api-access-kdnrl\") pod \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\" (UID: \"67f487ff-4f6f-46e6-8271-2bd6666ab5c3\") " Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.594535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-kube-api-access-kdnrl" (OuterVolumeSpecName: "kube-api-access-kdnrl") pod "67f487ff-4f6f-46e6-8271-2bd6666ab5c3" (UID: "67f487ff-4f6f-46e6-8271-2bd6666ab5c3"). InnerVolumeSpecName "kube-api-access-kdnrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.684219 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "67f487ff-4f6f-46e6-8271-2bd6666ab5c3" (UID: "67f487ff-4f6f-46e6-8271-2bd6666ab5c3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.685577 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.685688 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdnrl\" (UniqueName: \"kubernetes.io/projected/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-kube-api-access-kdnrl\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.694337 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.738641 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67f487ff-4f6f-46e6-8271-2bd6666ab5c3" (UID: "67f487ff-4f6f-46e6-8271-2bd6666ab5c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.750967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67f487ff-4f6f-46e6-8271-2bd6666ab5c3" (UID: "67f487ff-4f6f-46e6-8271-2bd6666ab5c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.761070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67f487ff-4f6f-46e6-8271-2bd6666ab5c3" (UID: "67f487ff-4f6f-46e6-8271-2bd6666ab5c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.789223 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.789257 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.789266 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.875225 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-config" (OuterVolumeSpecName: "config") pod "67f487ff-4f6f-46e6-8271-2bd6666ab5c3" (UID: "67f487ff-4f6f-46e6-8271-2bd6666ab5c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.877351 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.892888 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f487ff-4f6f-46e6-8271-2bd6666ab5c3-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.911429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8pvvw" event={"ID":"67f487ff-4f6f-46e6-8271-2bd6666ab5c3","Type":"ContainerDied","Data":"582c6c4487f97479b18d44b45a32314ec930f85973ed9d452d3324b661f6e098"} Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.911467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"61f9e37b-7e92-454e-8806-ab9bda2efbf7","Type":"ContainerStarted","Data":"6caf918f72f8b9dfaca988b83256e846afd8601487b781783c660cdd29542423"} Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.911485 4749 scope.go:117] "RemoveContainer" containerID="2ba8c605f84ee7fd33fe562a8d71da7392d435b5ebf8798fa6b3d2864adc9fb7" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.976028 4749 scope.go:117] "RemoveContainer" containerID="95b9ddf3e9000c8d606e68cde1c2e7a7bd3744c7be4d85bdd29628353f94d749" Jan 28 18:59:28 crc kubenswrapper[4749]: I0128 18:59:28.978357 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8pvvw"] Jan 28 18:59:29 crc kubenswrapper[4749]: I0128 18:59:29.024670 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8pvvw"] Jan 28 18:59:29 crc kubenswrapper[4749]: I0128 18:59:29.087420 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jntgh"] Jan 28 18:59:29 crc kubenswrapper[4749]: W0128 18:59:29.114855 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ef74f3e_9703_4dfd_ae3b_c3d1112f2dd5.slice/crio-91a28aaf49659a8f2b99c795cce894b0bfa98c273bdcaba0fd53ec129e75ae42 WatchSource:0}: Error finding container 91a28aaf49659a8f2b99c795cce894b0bfa98c273bdcaba0fd53ec129e75ae42: Status 404 returned error can't find the container with id 91a28aaf49659a8f2b99c795cce894b0bfa98c273bdcaba0fd53ec129e75ae42 Jan 28 18:59:29 crc kubenswrapper[4749]: I0128 18:59:29.176582 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:29 crc kubenswrapper[4749]: I0128 18:59:29.929422 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" containerID="5d3f64f2057aec6c03e0c59e1fc8c23b0998f06a8e264867622d4687cbfb4a79" exitCode=0 Jan 28 18:59:29 crc kubenswrapper[4749]: I0128 18:59:29.930490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" event={"ID":"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5","Type":"ContainerDied","Data":"5d3f64f2057aec6c03e0c59e1fc8c23b0998f06a8e264867622d4687cbfb4a79"} Jan 28 18:59:29 crc kubenswrapper[4749]: I0128 18:59:29.930554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" event={"ID":"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5","Type":"ContainerStarted","Data":"91a28aaf49659a8f2b99c795cce894b0bfa98c273bdcaba0fd53ec129e75ae42"} Jan 28 18:59:29 crc kubenswrapper[4749]: I0128 18:59:29.938600 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"233307a3-83e8-4fa9-9ba2-e2863125d6d3","Type":"ContainerStarted","Data":"d3f0328027f877600beb68430839b5ad883d2e041fdc6f28a5f81da69451a2d0"} Jan 28 18:59:30 crc kubenswrapper[4749]: I0128 18:59:30.241173 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:30 crc kubenswrapper[4749]: I0128 18:59:30.918956 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" path="/var/lib/kubelet/pods/67f487ff-4f6f-46e6-8271-2bd6666ab5c3/volumes" Jan 28 18:59:30 crc kubenswrapper[4749]: I0128 18:59:30.988817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" event={"ID":"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5","Type":"ContainerStarted","Data":"f012db1194f330cd2ffc16f8517203f501799877f708434404675427bda5180e"} Jan 28 18:59:30 crc kubenswrapper[4749]: I0128 18:59:30.989250 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:31 crc kubenswrapper[4749]: I0128 18:59:31.006554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"233307a3-83e8-4fa9-9ba2-e2863125d6d3","Type":"ContainerStarted","Data":"610067b67cd8eab7f272e0bd9e9c2eda76e6f323cc490f67989f420880add65f"} Jan 28 18:59:31 crc kubenswrapper[4749]: I0128 18:59:31.052866 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" podStartSLOduration=4.052842927 podStartE2EDuration="4.052842927s" podCreationTimestamp="2026-01-28 18:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:31.020931322 +0000 UTC m=+1439.032458137" watchObservedRunningTime="2026-01-28 18:59:31.052842927 +0000 UTC m=+1439.064369702" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.041620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"233307a3-83e8-4fa9-9ba2-e2863125d6d3","Type":"ContainerStarted","Data":"d83eba2704e71df593a5ab1ed348f85f375a7c35e67e1ea5e99dd2f1c5df9a67"} Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.042262 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerName="cinder-api-log" containerID="cri-o://610067b67cd8eab7f272e0bd9e9c2eda76e6f323cc490f67989f420880add65f" gracePeriod=30 Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.042545 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.042918 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerName="cinder-api" containerID="cri-o://d83eba2704e71df593a5ab1ed348f85f375a7c35e67e1ea5e99dd2f1c5df9a67" gracePeriod=30 Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.057052 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.068720 4749 generic.go:334] "Generic (PLEG): container finished" podID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerID="f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25" exitCode=137 Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.068814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9177835f-67ce-45ec-9c8c-843e54b25deb","Type":"ContainerDied","Data":"f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25"} Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.068844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9177835f-67ce-45ec-9c8c-843e54b25deb","Type":"ContainerDied","Data":"af7272dfb1b09d8cb4d03e6fb1487b3c3c9ea12beb53e859fa93ce2086bf7c0c"} Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.068863 4749 scope.go:117] "RemoveContainer" containerID="f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.078401 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.078380275 podStartE2EDuration="5.078380275s" podCreationTimestamp="2026-01-28 18:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:32.061741041 +0000 UTC m=+1440.073267816" watchObservedRunningTime="2026-01-28 18:59:32.078380275 +0000 UTC m=+1440.089907070" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.105464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"61f9e37b-7e92-454e-8806-ab9bda2efbf7","Type":"ContainerStarted","Data":"d91a884ea3421785e3434792b37594dbbfc17a0d60eb9fae3f460c471d0eb3ec"} Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.106926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-config-data\") pod \"9177835f-67ce-45ec-9c8c-843e54b25deb\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.107048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-sg-core-conf-yaml\") pod \"9177835f-67ce-45ec-9c8c-843e54b25deb\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.107095 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-log-httpd\") pod \"9177835f-67ce-45ec-9c8c-843e54b25deb\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.107184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-combined-ca-bundle\") pod \"9177835f-67ce-45ec-9c8c-843e54b25deb\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.107211 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njttk\" (UniqueName: 
\"kubernetes.io/projected/9177835f-67ce-45ec-9c8c-843e54b25deb-kube-api-access-njttk\") pod \"9177835f-67ce-45ec-9c8c-843e54b25deb\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.107242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-run-httpd\") pod \"9177835f-67ce-45ec-9c8c-843e54b25deb\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.107276 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-scripts\") pod \"9177835f-67ce-45ec-9c8c-843e54b25deb\" (UID: \"9177835f-67ce-45ec-9c8c-843e54b25deb\") " Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.111905 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9177835f-67ce-45ec-9c8c-843e54b25deb" (UID: "9177835f-67ce-45ec-9c8c-843e54b25deb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.113554 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-scripts" (OuterVolumeSpecName: "scripts") pod "9177835f-67ce-45ec-9c8c-843e54b25deb" (UID: "9177835f-67ce-45ec-9c8c-843e54b25deb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.113975 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9177835f-67ce-45ec-9c8c-843e54b25deb" (UID: "9177835f-67ce-45ec-9c8c-843e54b25deb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.116682 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9177835f-67ce-45ec-9c8c-843e54b25deb-kube-api-access-njttk" (OuterVolumeSpecName: "kube-api-access-njttk") pod "9177835f-67ce-45ec-9c8c-843e54b25deb" (UID: "9177835f-67ce-45ec-9c8c-843e54b25deb"). InnerVolumeSpecName "kube-api-access-njttk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.117867 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9177835f-67ce-45ec-9c8c-843e54b25deb" (UID: "9177835f-67ce-45ec-9c8c-843e54b25deb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.142939 4749 scope.go:117] "RemoveContainer" containerID="2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.218154 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.218183 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.218193 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njttk\" (UniqueName: \"kubernetes.io/projected/9177835f-67ce-45ec-9c8c-843e54b25deb-kube-api-access-njttk\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.218202 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9177835f-67ce-45ec-9c8c-843e54b25deb-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.218210 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.360215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9177835f-67ce-45ec-9c8c-843e54b25deb" (UID: "9177835f-67ce-45ec-9c8c-843e54b25deb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.436229 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.458087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-config-data" (OuterVolumeSpecName: "config-data") pod "9177835f-67ce-45ec-9c8c-843e54b25deb" (UID: "9177835f-67ce-45ec-9c8c-843e54b25deb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.537901 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9177835f-67ce-45ec-9c8c-843e54b25deb-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.690506 4749 scope.go:117] "RemoveContainer" containerID="8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.744811 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-568fc79444-j9z48"] Jan 28 18:59:32 crc kubenswrapper[4749]: E0128 18:59:32.745237 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="ceilometer-notification-agent" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745256 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="ceilometer-notification-agent" Jan 28 18:59:32 crc kubenswrapper[4749]: E0128 18:59:32.745271 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="proxy-httpd" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745277 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="proxy-httpd" Jan 28 18:59:32 crc kubenswrapper[4749]: E0128 18:59:32.745287 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerName="dnsmasq-dns" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745294 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerName="dnsmasq-dns" Jan 28 18:59:32 crc kubenswrapper[4749]: E0128 18:59:32.745305 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerName="init" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745311 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerName="init" Jan 28 18:59:32 crc kubenswrapper[4749]: E0128 18:59:32.745409 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="ceilometer-central-agent" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745416 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="ceilometer-central-agent" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745624 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="ceilometer-notification-agent" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745643 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="ceilometer-central-agent" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745660 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f487ff-4f6f-46e6-8271-2bd6666ab5c3" containerName="dnsmasq-dns" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.745669 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" containerName="proxy-httpd" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.746414 4749 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.801515 4749 scope.go:117] "RemoveContainer" containerID="f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25" Jan 28 18:59:32 crc kubenswrapper[4749]: E0128 18:59:32.809816 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25\": container with ID starting with f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25 not found: ID does not exist" containerID="f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.809876 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25"} err="failed to get container status \"f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25\": rpc error: code = NotFound desc = could not find container \"f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25\": container with ID starting with f4cebb3f74c16f7e9786baf89cbb77225ecd0571c6991b79be3f2774ddd33e25 not found: ID does not exist" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.809972 4749 scope.go:117] "RemoveContainer" containerID="2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.812936 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-jpk7v" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.818148 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Jan 28 18:59:32 crc kubenswrapper[4749]: E0128 18:59:32.820251 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb\": container with ID starting with 2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb not found: ID does not exist" containerID="2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.820304 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb"} err="failed to get container status \"2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb\": rpc error: code = NotFound desc = could not find container \"2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb\": container with ID starting with 2be623f51f6a09780901a756be8914105aaa6627e073fa9e169f91b508780acb not found: ID does not exist" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.820342 4749 scope.go:117] "RemoveContainer" containerID="8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b" Jan 28 18:59:32 crc kubenswrapper[4749]: E0128 18:59:32.822614 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b\": container with ID starting with 8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b not found: ID does not exist" 
containerID="8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.822654 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b"} err="failed to get container status \"8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b\": rpc error: code = NotFound desc = could not find container \"8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b\": container with ID starting with 8165af133bd4c7e2022c8c82134204eafac94fc8e79fe3db5f96eb8605f90d4b not found: ID does not exist" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.828809 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.863272 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.863377 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5xx\" (UniqueName: \"kubernetes.io/projected/47a65467-bac6-4a3c-b634-a3cbe9d282f3-kube-api-access-7z5xx\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.863504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data-custom\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.863584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-combined-ca-bundle\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.981007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.981338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5xx\" (UniqueName: \"kubernetes.io/projected/47a65467-bac6-4a3c-b634-a3cbe9d282f3-kube-api-access-7z5xx\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.981432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data-custom\") pod 
\"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:32 crc kubenswrapper[4749]: I0128 18:59:32.981495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-combined-ca-bundle\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.006491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.008503 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.017019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-568fc79444-j9z48"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.017076 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jntgh"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.017098 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-6xh7w"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.017687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-combined-ca-bundle\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.039316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5xx\" (UniqueName: \"kubernetes.io/projected/47a65467-bac6-4a3c-b634-a3cbe9d282f3-kube-api-access-7z5xx\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.039813 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.040226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data-custom\") pod \"heat-engine-568fc79444-j9z48\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.041532 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-6xh7w"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.073363 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-66cbb7db47-sh8m4"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.077184 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.085716 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.134706 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-66cbb7db47-sh8m4"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.150763 4749 generic.go:334] "Generic (PLEG): container finished" podID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerID="d83eba2704e71df593a5ab1ed348f85f375a7c35e67e1ea5e99dd2f1c5df9a67" exitCode=0 Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.150801 4749 generic.go:334] "Generic (PLEG): container finished" podID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerID="610067b67cd8eab7f272e0bd9e9c2eda76e6f323cc490f67989f420880add65f" exitCode=143 Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.150888 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"233307a3-83e8-4fa9-9ba2-e2863125d6d3","Type":"ContainerDied","Data":"d83eba2704e71df593a5ab1ed348f85f375a7c35e67e1ea5e99dd2f1c5df9a67"} Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.150919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"233307a3-83e8-4fa9-9ba2-e2863125d6d3","Type":"ContainerDied","Data":"610067b67cd8eab7f272e0bd9e9c2eda76e6f323cc490f67989f420880add65f"} Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.153728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"61f9e37b-7e92-454e-8806-ab9bda2efbf7","Type":"ContainerStarted","Data":"d0cc3923523498884708fa069a1e51907aff2b85fa0b052a7343b1960f00cb3a"} Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.170983 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" podUID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" containerName="dnsmasq-dns" containerID="cri-o://f012db1194f330cd2ffc16f8517203f501799877f708434404675427bda5180e" gracePeriod=10 Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.171273 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.173320 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-9dfc74cb7-xtw7l"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.174378 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-jpk7v" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.180649 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.182040 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.185794 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.191532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cl5r\" (UniqueName: \"kubernetes.io/projected/f9ffa10e-04c0-4b2f-91ce-6614d405034a-kube-api-access-4cl5r\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.191580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.191628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-config\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.191676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.191705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.191780 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.191815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.194458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-combined-ca-bundle\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.197545 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data-custom\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.197633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhc9l\" (UniqueName: \"kubernetes.io/projected/3ff11c89-e09d-40a3-ac11-4eece14288c9-kube-api-access-lhc9l\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.201474 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9dfc74cb7-xtw7l"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data-custom\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301391 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-combined-ca-bundle\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhc9l\" (UniqueName: \"kubernetes.io/projected/3ff11c89-e09d-40a3-ac11-4eece14288c9-kube-api-access-lhc9l\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cl5r\" (UniqueName: \"kubernetes.io/projected/f9ffa10e-04c0-4b2f-91ce-6614d405034a-kube-api-access-4cl5r\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-config\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.301630 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.302199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.302258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27m79\" (UniqueName: \"kubernetes.io/projected/be080cb1-98cd-4790-8132-52818502d1a0-kube-api-access-27m79\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.302353 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.302433 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-combined-ca-bundle\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.302469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data-custom\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.303578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.305184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.305358 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.305799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.307079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.307303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-config\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.310014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data-custom\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.314792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-combined-ca-bundle\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.319357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.339256 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cl5r\" (UniqueName: \"kubernetes.io/projected/f9ffa10e-04c0-4b2f-91ce-6614d405034a-kube-api-access-4cl5r\") pod \"dnsmasq-dns-7756b9d78c-6xh7w\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.351429 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhc9l\" (UniqueName: \"kubernetes.io/projected/3ff11c89-e09d-40a3-ac11-4eece14288c9-kube-api-access-lhc9l\") pod \"heat-cfnapi-66cbb7db47-sh8m4\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.399292 4749 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.527012328 podStartE2EDuration="6.399268763s" podCreationTimestamp="2026-01-28 18:59:27 +0000 UTC" firstStartedPulling="2026-01-28 18:59:28.7382499 +0000 UTC m=+1436.749776675" lastFinishedPulling="2026-01-28 18:59:29.610506335 +0000 UTC m=+1437.622033110" observedRunningTime="2026-01-28 18:59:33.182631953 +0000 UTC m=+1441.194158748" watchObservedRunningTime="2026-01-28 18:59:33.399268763 +0000 UTC m=+1441.410795548" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405096 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/233307a3-83e8-4fa9-9ba2-e2863125d6d3-etc-machine-id\") pod \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405150 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-combined-ca-bundle\") pod \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405174 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233307a3-83e8-4fa9-9ba2-e2863125d6d3-logs\") pod \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405194 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-scripts\") pod \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj8lr\" (UniqueName: \"kubernetes.io/projected/233307a3-83e8-4fa9-9ba2-e2863125d6d3-kube-api-access-lj8lr\") pod \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405366 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data-custom\") pod \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405492 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data\") pod \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\" (UID: \"233307a3-83e8-4fa9-9ba2-e2863125d6d3\") " Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data-custom\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-combined-ca-bundle\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27m79\" (UniqueName: \"kubernetes.io/projected/be080cb1-98cd-4790-8132-52818502d1a0-kube-api-access-27m79\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.405992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.413989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233307a3-83e8-4fa9-9ba2-e2863125d6d3-logs" (OuterVolumeSpecName: "logs") pod "233307a3-83e8-4fa9-9ba2-e2863125d6d3" (UID: "233307a3-83e8-4fa9-9ba2-e2863125d6d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.415769 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.420040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-scripts" (OuterVolumeSpecName: "scripts") pod "233307a3-83e8-4fa9-9ba2-e2863125d6d3" (UID: "233307a3-83e8-4fa9-9ba2-e2863125d6d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.420111 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/233307a3-83e8-4fa9-9ba2-e2863125d6d3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "233307a3-83e8-4fa9-9ba2-e2863125d6d3" (UID: "233307a3-83e8-4fa9-9ba2-e2863125d6d3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.421463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "233307a3-83e8-4fa9-9ba2-e2863125d6d3" (UID: "233307a3-83e8-4fa9-9ba2-e2863125d6d3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.421821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-combined-ca-bundle\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.423035 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233307a3-83e8-4fa9-9ba2-e2863125d6d3-kube-api-access-lj8lr" (OuterVolumeSpecName: "kube-api-access-lj8lr") pod "233307a3-83e8-4fa9-9ba2-e2863125d6d3" (UID: "233307a3-83e8-4fa9-9ba2-e2863125d6d3"). InnerVolumeSpecName "kube-api-access-lj8lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.441387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data-custom\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.448369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.467922 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.490921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27m79\" (UniqueName: \"kubernetes.io/projected/be080cb1-98cd-4790-8132-52818502d1a0-kube-api-access-27m79\") pod \"heat-api-9dfc74cb7-xtw7l\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.513310 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/233307a3-83e8-4fa9-9ba2-e2863125d6d3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.513374 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233307a3-83e8-4fa9-9ba2-e2863125d6d3-logs\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.513390 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.513404 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj8lr\" (UniqueName: \"kubernetes.io/projected/233307a3-83e8-4fa9-9ba2-e2863125d6d3-kube-api-access-lj8lr\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.513965 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:33 crc 
kubenswrapper[4749]: I0128 18:59:33.518046 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.524669 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.589839 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.629460 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:59:33 crc kubenswrapper[4749]: E0128 18:59:33.630039 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerName="cinder-api-log" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.630060 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerName="cinder-api-log" Jan 28 18:59:33 crc kubenswrapper[4749]: E0128 18:59:33.630078 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerName="cinder-api" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.630099 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerName="cinder-api" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.630359 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerName="cinder-api-log" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.630379 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" containerName="cinder-api" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.632358 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.647144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "233307a3-83e8-4fa9-9ba2-e2863125d6d3" (UID: "233307a3-83e8-4fa9-9ba2-e2863125d6d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.647806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.648881 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.728849 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-scripts\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.728902 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-config-data\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.728968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.728988 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-log-httpd\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.729050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-run-httpd\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.729070 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l25dp\" (UniqueName: \"kubernetes.io/projected/26a1df38-b5be-4122-a907-fadff9ea2487-kube-api-access-l25dp\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.729101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.729190 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.741122 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.831958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-run-httpd\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.832372 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l25dp\" (UniqueName: \"kubernetes.io/projected/26a1df38-b5be-4122-a907-fadff9ea2487-kube-api-access-l25dp\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.832517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.832744 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-scripts\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.832917 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-config-data\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.833161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.833263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-log-httpd\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.834697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-log-httpd\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.835148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-run-httpd\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.842612 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-scripts\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.846009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-config-data\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.864268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.869419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.879015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l25dp\" (UniqueName: \"kubernetes.io/projected/26a1df38-b5be-4122-a907-fadff9ea2487-kube-api-access-l25dp\") pod \"ceilometer-0\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " pod="openstack/ceilometer-0" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.903125 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data" (OuterVolumeSpecName: "config-data") pod "233307a3-83e8-4fa9-9ba2-e2863125d6d3" (UID: "233307a3-83e8-4fa9-9ba2-e2863125d6d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.935536 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233307a3-83e8-4fa9-9ba2-e2863125d6d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:33 crc kubenswrapper[4749]: I0128 18:59:33.986741 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.286339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"233307a3-83e8-4fa9-9ba2-e2863125d6d3","Type":"ContainerDied","Data":"d3f0328027f877600beb68430839b5ad883d2e041fdc6f28a5f81da69451a2d0"} Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.286754 4749 scope.go:117] "RemoveContainer" containerID="d83eba2704e71df593a5ab1ed348f85f375a7c35e67e1ea5e99dd2f1c5df9a67" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.286876 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.315366 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" containerID="f012db1194f330cd2ffc16f8517203f501799877f708434404675427bda5180e" exitCode=0 Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.315556 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" event={"ID":"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5","Type":"ContainerDied","Data":"f012db1194f330cd2ffc16f8517203f501799877f708434404675427bda5180e"} Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.412190 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.478653 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.486150 4749 scope.go:117] "RemoveContainer" containerID="610067b67cd8eab7f272e0bd9e9c2eda76e6f323cc490f67989f420880add65f" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.518809 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.521195 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.532819 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.533035 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.533367 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.533646 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.538462 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.620450 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-svc\") pod \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.620826 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-config\") pod \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.620951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-swift-storage-0\") pod \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621064 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-sb\") pod \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-nb\") pod \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c7kh\" (UniqueName: \"kubernetes.io/projected/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-kube-api-access-9c7kh\") pod \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\" (UID: \"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5\") " Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621466 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cx9j\" (UniqueName: \"kubernetes.io/projected/c145454f-dbd9-46ff-b588-6c86198dc2e6-kube-api-access-7cx9j\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-scripts\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621596 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c145454f-dbd9-46ff-b588-6c86198dc2e6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621657 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c145454f-dbd9-46ff-b588-6c86198dc2e6-logs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-config-data\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621782 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-config-data-custom\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.621891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.641838 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-kube-api-access-9c7kh" (OuterVolumeSpecName: "kube-api-access-9c7kh") pod "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" (UID: "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5"). InnerVolumeSpecName "kube-api-access-9c7kh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724122 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-config-data-custom\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cx9j\" (UniqueName: \"kubernetes.io/projected/c145454f-dbd9-46ff-b588-6c86198dc2e6-kube-api-access-7cx9j\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724310 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-scripts\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c145454f-dbd9-46ff-b588-6c86198dc2e6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c145454f-dbd9-46ff-b588-6c86198dc2e6-logs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-config-data\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.724563 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c7kh\" (UniqueName: \"kubernetes.io/projected/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-kube-api-access-9c7kh\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:34 crc 
kubenswrapper[4749]: I0128 18:59:34.727399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c145454f-dbd9-46ff-b588-6c86198dc2e6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.728556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c145454f-dbd9-46ff-b588-6c86198dc2e6-logs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.729415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-config-data\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.731809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-config-data-custom\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.735499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.740875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.742313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-scripts\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.773485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c145454f-dbd9-46ff-b588-6c86198dc2e6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.786041 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cx9j\" (UniqueName: \"kubernetes.io/projected/c145454f-dbd9-46ff-b588-6c86198dc2e6-kube-api-access-7cx9j\") pod \"cinder-api-0\" (UID: \"c145454f-dbd9-46ff-b588-6c86198dc2e6\") " pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.800343 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-config" (OuterVolumeSpecName: "config") pod "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" (UID: "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.814801 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" (UID: "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.827264 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.827659 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.844258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" (UID: "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.850029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" (UID: "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.860915 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.935922 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233307a3-83e8-4fa9-9ba2-e2863125d6d3" path="/var/lib/kubelet/pods/233307a3-83e8-4fa9-9ba2-e2863125d6d3/volumes" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.936773 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9177835f-67ce-45ec-9c8c-843e54b25deb" path="/var/lib/kubelet/pods/9177835f-67ce-45ec-9c8c-843e54b25deb/volumes" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.941901 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.941939 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.954607 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" (UID: "5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:34 crc kubenswrapper[4749]: I0128 18:59:34.982017 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-568fc79444-j9z48"] Jan 28 18:59:34 crc kubenswrapper[4749]: W0128 18:59:34.990489 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ff11c89_e09d_40a3_ac11_4eece14288c9.slice/crio-48e815ac6521b4271d5d6dedebdb37a6fffa4f109be87dff46bc59f1757ecd15 WatchSource:0}: Error finding container 48e815ac6521b4271d5d6dedebdb37a6fffa4f109be87dff46bc59f1757ecd15: Status 404 returned error can't find the container with id 48e815ac6521b4271d5d6dedebdb37a6fffa4f109be87dff46bc59f1757ecd15 Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.037851 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-66cbb7db47-sh8m4"] Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.054882 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.123484 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9dfc74cb7-xtw7l"] Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.169036 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-6xh7w"] Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.211479 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.339324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerStarted","Data":"b43583f2792bc6eb994e377cff59cfeea2b5b9c70b0de3a462e37662114ecacb"} Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.352883 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" event={"ID":"3ff11c89-e09d-40a3-ac11-4eece14288c9","Type":"ContainerStarted","Data":"48e815ac6521b4271d5d6dedebdb37a6fffa4f109be87dff46bc59f1757ecd15"} Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.357077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" event={"ID":"f9ffa10e-04c0-4b2f-91ce-6614d405034a","Type":"ContainerStarted","Data":"d9f199129eaadcbea393fb11f82c51995366a8f5479cfe1f3f0d0be8f07ad82e"} Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.370756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-568fc79444-j9z48" event={"ID":"47a65467-bac6-4a3c-b634-a3cbe9d282f3","Type":"ContainerStarted","Data":"7966ac58ea5be4cd95b529000768cfc73c71a776135ce122580d4ff253b068bb"} Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.373235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9dfc74cb7-xtw7l" event={"ID":"be080cb1-98cd-4790-8132-52818502d1a0","Type":"ContainerStarted","Data":"685518c8221fb62585c1bff29ae716bf513e7339e6cc4046deeb647f748d79db"} Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.384077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" event={"ID":"5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5","Type":"ContainerDied","Data":"91a28aaf49659a8f2b99c795cce894b0bfa98c273bdcaba0fd53ec129e75ae42"} Jan 28 18:59:35 crc 
kubenswrapper[4749]: I0128 18:59:35.384130 4749 scope.go:117] "RemoveContainer" containerID="f012db1194f330cd2ffc16f8517203f501799877f708434404675427bda5180e" Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.384245 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jntgh" Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.479809 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jntgh"] Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.484554 4749 scope.go:117] "RemoveContainer" containerID="5d3f64f2057aec6c03e0c59e1fc8c23b0998f06a8e264867622d4687cbfb4a79" Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.496443 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jntgh"] Jan 28 18:59:35 crc kubenswrapper[4749]: I0128 18:59:35.516557 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 18:59:36 crc kubenswrapper[4749]: I0128 18:59:36.461796 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerID="bc8606ef844ee2362c996468fd9e09f9cff780b5d59867e6360add18cd1a587a" exitCode=0 Jan 28 18:59:36 crc kubenswrapper[4749]: I0128 18:59:36.463616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" event={"ID":"f9ffa10e-04c0-4b2f-91ce-6614d405034a","Type":"ContainerDied","Data":"bc8606ef844ee2362c996468fd9e09f9cff780b5d59867e6360add18cd1a587a"} Jan 28 18:59:36 crc kubenswrapper[4749]: I0128 18:59:36.474649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c145454f-dbd9-46ff-b588-6c86198dc2e6","Type":"ContainerStarted","Data":"07732e3edfdde7fe7974491e73114f045ac0b3995c9449dec8aeecdc33f57299"} Jan 28 18:59:36 crc kubenswrapper[4749]: I0128 18:59:36.478337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-568fc79444-j9z48" event={"ID":"47a65467-bac6-4a3c-b634-a3cbe9d282f3","Type":"ContainerStarted","Data":"2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8"} Jan 28 18:59:36 crc kubenswrapper[4749]: I0128 18:59:36.481553 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:36 crc kubenswrapper[4749]: I0128 18:59:36.638680 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-568fc79444-j9z48" podStartSLOduration=4.6386595 podStartE2EDuration="4.6386595s" podCreationTimestamp="2026-01-28 18:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:36.577071147 +0000 UTC m=+1444.588597942" watchObservedRunningTime="2026-01-28 18:59:36.6386595 +0000 UTC m=+1444.650186275" Jan 28 18:59:36 crc kubenswrapper[4749]: I0128 18:59:36.899398 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" path="/var/lib/kubelet/pods/5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5/volumes" Jan 28 18:59:36 crc kubenswrapper[4749]: I0128 18:59:36.964524 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 18:59:36 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 
28 18:59:36 crc kubenswrapper[4749]: > Jan 28 18:59:37 crc kubenswrapper[4749]: I0128 18:59:37.405799 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 28 18:59:37 crc kubenswrapper[4749]: I0128 18:59:37.408292 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.203:8080/\": dial tcp 10.217.0.203:8080: connect: connection refused" Jan 28 18:59:37 crc kubenswrapper[4749]: I0128 18:59:37.524522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" event={"ID":"f9ffa10e-04c0-4b2f-91ce-6614d405034a","Type":"ContainerStarted","Data":"2812fb5cb1f4918f47fd7f8b8d0f8a7818946b031b3fed1d58f86293259f36c6"} Jan 28 18:59:37 crc kubenswrapper[4749]: I0128 18:59:37.525281 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:37 crc kubenswrapper[4749]: I0128 18:59:37.531311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c145454f-dbd9-46ff-b588-6c86198dc2e6","Type":"ContainerStarted","Data":"cf55faf0102cf410fe630688a584f1010056176bef1de7c80831c70d79450ca5"} Jan 28 18:59:37 crc kubenswrapper[4749]: I0128 18:59:37.535936 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerStarted","Data":"81f41dea5270f00febe5e1418c94c60437f9b80add2450a56e984c8a68e0a319"} Jan 28 18:59:37 crc kubenswrapper[4749]: I0128 18:59:37.557368 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" podStartSLOduration=5.55734841 podStartE2EDuration="5.55734841s" podCreationTimestamp="2026-01-28 18:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:37.552474269 +0000 UTC m=+1445.564001064" watchObservedRunningTime="2026-01-28 18:59:37.55734841 +0000 UTC m=+1445.568875185" Jan 28 18:59:38 crc kubenswrapper[4749]: I0128 18:59:38.550007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c145454f-dbd9-46ff-b588-6c86198dc2e6","Type":"ContainerStarted","Data":"c03ba3421f51dd7f990bbf13440b787946981d66eba2ccb65fec71dac80f5d21"} Jan 28 18:59:38 crc kubenswrapper[4749]: I0128 18:59:38.550804 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 28 18:59:38 crc kubenswrapper[4749]: I0128 18:59:38.555544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerStarted","Data":"28377c98a249b2dbc6102f099da44395b0e5123ddd994b459942eb436a3d6eaf"} Jan 28 18:59:38 crc kubenswrapper[4749]: I0128 18:59:38.589697 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.589670148 podStartE2EDuration="4.589670148s" podCreationTimestamp="2026-01-28 18:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:38.572127201 +0000 UTC m=+1446.583653996" watchObservedRunningTime="2026-01-28 18:59:38.589670148 +0000 UTC m=+1446.601196933" Jan 28 18:59:39 
crc kubenswrapper[4749]: I0128 18:59:39.143451 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-694b5876b5-phgqj"] Jan 28 18:59:39 crc kubenswrapper[4749]: E0128 18:59:39.144242 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" containerName="dnsmasq-dns" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.144259 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" containerName="dnsmasq-dns" Jan 28 18:59:39 crc kubenswrapper[4749]: E0128 18:59:39.144301 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" containerName="init" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.144311 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" containerName="init" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.144550 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef74f3e-9703-4dfd-ae3b-c3d1112f2dd5" containerName="dnsmasq-dns" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.145779 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.180319 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.180699 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.181110 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.219805 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-694b5876b5-phgqj"] Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.302066 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-config-data\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.302650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpdwb\" (UniqueName: \"kubernetes.io/projected/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-kube-api-access-hpdwb\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.302851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-etc-swift\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.303119 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-log-httpd\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " 
pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.303271 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-public-tls-certs\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.303565 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-combined-ca-bundle\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.303735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-internal-tls-certs\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.303826 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-run-httpd\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.406096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpdwb\" (UniqueName: \"kubernetes.io/projected/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-kube-api-access-hpdwb\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.406235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-etc-swift\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.406289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-log-httpd\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.406308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-public-tls-certs\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.406363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-combined-ca-bundle\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: 
\"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.406478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-internal-tls-certs\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.406515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-run-httpd\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.406552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-config-data\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.414240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-run-httpd\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.414382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-public-tls-certs\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.415921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-log-httpd\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.428756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-combined-ca-bundle\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.429010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-config-data\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.429517 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-etc-swift\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 
18:59:39.429814 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-internal-tls-certs\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.432953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpdwb\" (UniqueName: \"kubernetes.io/projected/4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5-kube-api-access-hpdwb\") pod \"swift-proxy-694b5876b5-phgqj\" (UID: \"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5\") " pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:39 crc kubenswrapper[4749]: I0128 18:59:39.495903 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:40 crc kubenswrapper[4749]: I0128 18:59:40.577951 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.130708 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-694b5876b5-phgqj"] Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.389425 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4rdwp"] Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.392618 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.415449 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rdwp"] Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.460776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-catalog-content\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.460895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w78fj\" (UniqueName: \"kubernetes.io/projected/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-kube-api-access-w78fj\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.460928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-utilities\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.562646 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w78fj\" (UniqueName: \"kubernetes.io/projected/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-kube-api-access-w78fj\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.562705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-utilities\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.562835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-catalog-content\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.566817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-catalog-content\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.566908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-utilities\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.604738 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w78fj\" (UniqueName: \"kubernetes.io/projected/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-kube-api-access-w78fj\") pod \"redhat-marketplace-4rdwp\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.733851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerStarted","Data":"1f5d2ba6e0f4233f2795d3ab8b6db4e9c2f3b04c37e17a7039e7728f385ccdc9"} Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.735671 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.752953 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" event={"ID":"3ff11c89-e09d-40a3-ac11-4eece14288c9","Type":"ContainerStarted","Data":"1a377df682203d3246a5153e52e52e9331ff0367563f0295d3cbcafeffd0cee3"} Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.753498 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.768846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9dfc74cb7-xtw7l" event={"ID":"be080cb1-98cd-4790-8132-52818502d1a0","Type":"ContainerStarted","Data":"f742fe22d8ff1ea7a10f5755d2836f77ec6120570265207db51bb9533c093c5c"} Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.770039 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.782496 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" podStartSLOduration=4.431212939 podStartE2EDuration="9.782473127s" podCreationTimestamp="2026-01-28 18:59:32 +0000 UTC" firstStartedPulling="2026-01-28 18:59:35.010471945 +0000 UTC m=+1443.021998720" lastFinishedPulling="2026-01-28 18:59:40.361732133 +0000 UTC m=+1448.373258908" observedRunningTime="2026-01-28 18:59:41.781495662 +0000 UTC m=+1449.793022477" watchObservedRunningTime="2026-01-28 18:59:41.782473127 +0000 UTC m=+1449.793999922" Jan 28 18:59:41 crc kubenswrapper[4749]: I0128 18:59:41.821160 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-9dfc74cb7-xtw7l" podStartSLOduration=4.469243375 podStartE2EDuration="9.821142128s" podCreationTimestamp="2026-01-28 18:59:32 +0000 UTC" firstStartedPulling="2026-01-28 18:59:35.001866801 +0000 UTC m=+1443.013393576" lastFinishedPulling="2026-01-28 18:59:40.353765554 +0000 UTC m=+1448.365292329" observedRunningTime="2026-01-28 18:59:41.819147979 +0000 UTC m=+1449.830674754" watchObservedRunningTime="2026-01-28 18:59:41.821142128 +0000 UTC m=+1449.832668923" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.731404 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-74bcff4f86-hqml8"] Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.733762 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.742171 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5667478cf5-vcwtt"] Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.743815 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.776579 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-74bcff4f86-hqml8"] Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.791664 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5667478cf5-vcwtt"] Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.810045 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.848878 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-778d479c94-bcw7x"] Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.850879 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.867179 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-778d479c94-bcw7x"] Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.916601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.916651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9j4\" (UniqueName: \"kubernetes.io/projected/b7c300e5-3d03-481e-bd44-85d8f7d44e72-kube-api-access-hq9j4\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.916716 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data-custom\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.916805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-config-data\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.916905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-config-data-custom\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.917069 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-combined-ca-bundle\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:42 crc 
kubenswrapper[4749]: I0128 18:59:42.917153 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-combined-ca-bundle\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.917184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbqn6\" (UniqueName: \"kubernetes.io/projected/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-kube-api-access-tbqn6\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:42 crc kubenswrapper[4749]: I0128 18:59:42.921738 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9j4\" (UniqueName: \"kubernetes.io/projected/b7c300e5-3d03-481e-bd44-85d8f7d44e72-kube-api-access-hq9j4\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data-custom\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-config-data\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-config-data-custom\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-combined-ca-bundle\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-combined-ca-bundle\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019865 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data-custom\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbqn6\" (UniqueName: \"kubernetes.io/projected/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-kube-api-access-tbqn6\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.019982 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-combined-ca-bundle\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.020025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.020124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmn5p\" (UniqueName: \"kubernetes.io/projected/7fdd109b-56e0-4cf0-a763-8573984ee415-kube-api-access-dmn5p\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.020188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.026978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-config-data\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.027409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-config-data-custom\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.028020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-combined-ca-bundle\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.032074 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data-custom\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.038970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c300e5-3d03-481e-bd44-85d8f7d44e72-combined-ca-bundle\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.053915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.091159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9j4\" (UniqueName: \"kubernetes.io/projected/b7c300e5-3d03-481e-bd44-85d8f7d44e72-kube-api-access-hq9j4\") pod \"heat-engine-5667478cf5-vcwtt\" (UID: \"b7c300e5-3d03-481e-bd44-85d8f7d44e72\") " pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.104203 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.115754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbqn6\" (UniqueName: \"kubernetes.io/projected/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-kube-api-access-tbqn6\") pod \"heat-cfnapi-74bcff4f86-hqml8\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.149567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data-custom\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.149663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-combined-ca-bundle\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.149713 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.149826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmn5p\" (UniqueName: \"kubernetes.io/projected/7fdd109b-56e0-4cf0-a763-8573984ee415-kube-api-access-dmn5p\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 
18:59:43.287636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.291760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data-custom\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.292686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmn5p\" (UniqueName: \"kubernetes.io/projected/7fdd109b-56e0-4cf0-a763-8573984ee415-kube-api-access-dmn5p\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.308236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-combined-ca-bundle\") pod \"heat-api-778d479c94-bcw7x\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.368373 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.417501 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.478532 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.483295 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-l7sjv"] Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.483718 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerName="dnsmasq-dns" containerID="cri-o://1fb8426f821aecf4deb16aef03b8357aaa391d6697d11c2f6bf72f61aafe5a84" gracePeriod=10 Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.824456 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerName="cinder-scheduler" containerID="cri-o://d91a884ea3421785e3434792b37594dbbfc17a0d60eb9fae3f460c471d0eb3ec" gracePeriod=30 Jan 28 18:59:43 crc kubenswrapper[4749]: I0128 18:59:43.826218 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerName="probe" containerID="cri-o://d0cc3923523498884708fa069a1e51907aff2b85fa0b052a7343b1960f00cb3a" gracePeriod=30 Jan 28 18:59:44 crc kubenswrapper[4749]: I0128 18:59:44.842521 4749 generic.go:334] "Generic (PLEG): container finished" podID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerID="1fb8426f821aecf4deb16aef03b8357aaa391d6697d11c2f6bf72f61aafe5a84" exitCode=0 Jan 28 18:59:44 crc kubenswrapper[4749]: I0128 18:59:44.842594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" event={"ID":"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c","Type":"ContainerDied","Data":"1fb8426f821aecf4deb16aef03b8357aaa391d6697d11c2f6bf72f61aafe5a84"} Jan 28 18:59:44 crc kubenswrapper[4749]: I0128 18:59:44.955675 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: connect: connection refused" Jan 28 18:59:45 crc kubenswrapper[4749]: I0128 18:59:45.866191 4749 generic.go:334] "Generic (PLEG): container finished" podID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerID="d0cc3923523498884708fa069a1e51907aff2b85fa0b052a7343b1960f00cb3a" exitCode=0 Jan 28 18:59:45 crc kubenswrapper[4749]: I0128 18:59:45.866532 4749 generic.go:334] "Generic (PLEG): container finished" podID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerID="d91a884ea3421785e3434792b37594dbbfc17a0d60eb9fae3f460c471d0eb3ec" exitCode=0 Jan 28 18:59:45 crc kubenswrapper[4749]: I0128 18:59:45.866262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"61f9e37b-7e92-454e-8806-ab9bda2efbf7","Type":"ContainerDied","Data":"d0cc3923523498884708fa069a1e51907aff2b85fa0b052a7343b1960f00cb3a"} Jan 28 18:59:45 crc kubenswrapper[4749]: I0128 18:59:45.866571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"61f9e37b-7e92-454e-8806-ab9bda2efbf7","Type":"ContainerDied","Data":"d91a884ea3421785e3434792b37594dbbfc17a0d60eb9fae3f460c471d0eb3ec"} Jan 28 18:59:46 crc kubenswrapper[4749]: I0128 18:59:46.913292 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" 
probeResult="failure" output=< Jan 28 18:59:46 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:59:46 crc kubenswrapper[4749]: > Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.682566 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-9dfc74cb7-xtw7l"] Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.686452 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-9dfc74cb7-xtw7l" podUID="be080cb1-98cd-4790-8132-52818502d1a0" containerName="heat-api" containerID="cri-o://f742fe22d8ff1ea7a10f5755d2836f77ec6120570265207db51bb9533c093c5c" gracePeriod=60 Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.696487 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-66cbb7db47-sh8m4"] Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.696695 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" podUID="3ff11c89-e09d-40a3-ac11-4eece14288c9" containerName="heat-cfnapi" containerID="cri-o://1a377df682203d3246a5153e52e52e9331ff0367563f0295d3cbcafeffd0cee3" gracePeriod=60 Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.699168 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-9dfc74cb7-xtw7l" podUID="be080cb1-98cd-4790-8132-52818502d1a0" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.209:8004/healthcheck\": EOF" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.710642 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7b886dbf44-fw2gl"] Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.712866 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.714592 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" podUID="3ff11c89-e09d-40a3-ac11-4eece14288c9" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.208:8000/healthcheck\": EOF" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.718607 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.718712 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.728417 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-758976cb66-pthtk"] Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.730509 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.733245 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.735564 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.758429 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7b886dbf44-fw2gl"] Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.803217 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-758976cb66-pthtk"] Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-config-data\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btxsd\" (UniqueName: \"kubernetes.io/projected/df71ee0d-ef89-4e80-807c-2751810bca99-kube-api-access-btxsd\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-internal-tls-certs\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813799 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-public-tls-certs\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813826 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-config-data-custom\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-public-tls-certs\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-internal-tls-certs\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " 
pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813961 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbpjc\" (UniqueName: \"kubernetes.io/projected/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-kube-api-access-hbpjc\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.813984 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-combined-ca-bundle\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.814048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-config-data\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.814084 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-combined-ca-bundle\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.814111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-config-data-custom\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.915943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-config-data-custom\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-config-data\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btxsd\" (UniqueName: \"kubernetes.io/projected/df71ee0d-ef89-4e80-807c-2751810bca99-kube-api-access-btxsd\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-internal-tls-certs\") pod \"heat-api-758976cb66-pthtk\" (UID: 
\"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-public-tls-certs\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-config-data-custom\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-public-tls-certs\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-internal-tls-certs\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916381 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbpjc\" (UniqueName: \"kubernetes.io/projected/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-kube-api-access-hbpjc\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916406 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-combined-ca-bundle\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-config-data\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.916504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-combined-ca-bundle\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.922838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-combined-ca-bundle\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " 
pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.923506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-config-data-custom\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.926116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-public-tls-certs\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.926898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-internal-tls-certs\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.927254 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-combined-ca-bundle\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.929565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-config-data\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.930188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-internal-tls-certs\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.932147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-config-data-custom\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.933406 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df71ee0d-ef89-4e80-807c-2751810bca99-config-data\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.938488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-public-tls-certs\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.942944 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-btxsd\" (UniqueName: \"kubernetes.io/projected/df71ee0d-ef89-4e80-807c-2751810bca99-kube-api-access-btxsd\") pod \"heat-cfnapi-7b886dbf44-fw2gl\" (UID: \"df71ee0d-ef89-4e80-807c-2751810bca99\") " pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:47 crc kubenswrapper[4749]: I0128 18:59:47.948062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbpjc\" (UniqueName: \"kubernetes.io/projected/8a13bcdb-3c7e-487b-90a5-5f794941eb5d-kube-api-access-hbpjc\") pod \"heat-api-758976cb66-pthtk\" (UID: \"8a13bcdb-3c7e-487b-90a5-5f794941eb5d\") " pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:48 crc kubenswrapper[4749]: I0128 18:59:48.051962 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 18:59:48 crc kubenswrapper[4749]: I0128 18:59:48.076233 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-758976cb66-pthtk" Jan 28 18:59:48 crc kubenswrapper[4749]: I0128 18:59:48.401184 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:59:48 crc kubenswrapper[4749]: I0128 18:59:48.414452 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerName="glance-log" containerID="cri-o://3c117463354fb01b8ba3165872468641387f0f3541f0ffe39c4e83aaa554b672" gracePeriod=30 Jan 28 18:59:48 crc kubenswrapper[4749]: I0128 18:59:48.414602 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerName="glance-httpd" containerID="cri-o://fe96bafc836b8fe4bd89f61179e16c46df82a02f668b823efbf0da3bd28aebb7" gracePeriod=30 Jan 28 18:59:48 crc kubenswrapper[4749]: I0128 18:59:48.877552 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="c145454f-dbd9-46ff-b588-6c86198dc2e6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.211:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 18:59:48 crc kubenswrapper[4749]: I0128 18:59:48.909615 4749 generic.go:334] "Generic (PLEG): container finished" podID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerID="3c117463354fb01b8ba3165872468641387f0f3541f0ffe39c4e83aaa554b672" exitCode=143 Jan 28 18:59:48 crc kubenswrapper[4749]: I0128 18:59:48.909662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c4707cc-a6c3-4f08-870d-9d8c3a9da583","Type":"ContainerDied","Data":"3c117463354fb01b8ba3165872468641387f0f3541f0ffe39c4e83aaa554b672"} Jan 28 18:59:49 crc kubenswrapper[4749]: I0128 18:59:49.475412 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 28 18:59:49 crc kubenswrapper[4749]: I0128 18:59:49.954529 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: connect: connection refused" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.369172 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67c94bd564-mjrws" 
Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.442232 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6czqc"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.475580 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6czqc" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.481849 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6czqc"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.516918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-operator-scripts\") pod \"nova-api-db-create-6czqc\" (UID: \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\") " pod="openstack/nova-api-db-create-6czqc" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.518304 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22s59\" (UniqueName: \"kubernetes.io/projected/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-kube-api-access-22s59\") pod \"nova-api-db-create-6czqc\" (UID: \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\") " pod="openstack/nova-api-db-create-6czqc" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.536401 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.536742 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerName="glance-log" containerID="cri-o://880a7dc39a6a4a6ccdae1fd997c0bd3c477728c52d4b8bf60922ccbc63c7cd91" gracePeriod=30 Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.537424 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerName="glance-httpd" containerID="cri-o://8224af6ead75455ff44004cb6400c942a48571511762516f155cf6f0ca763836" gracePeriod=30 Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.614244 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-55c2-account-create-update-478qb"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.620515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22s59\" (UniqueName: \"kubernetes.io/projected/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-kube-api-access-22s59\") pod \"nova-api-db-create-6czqc\" (UID: \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\") " pod="openstack/nova-api-db-create-6czqc" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.620611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-operator-scripts\") pod \"nova-api-db-create-6czqc\" (UID: \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\") " pod="openstack/nova-api-db-create-6czqc" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.621718 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-operator-scripts\") pod \"nova-api-db-create-6czqc\" (UID: \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\") " pod="openstack/nova-api-db-create-6czqc" Jan 28 18:59:51 crc 
kubenswrapper[4749]: I0128 18:59:51.624014 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.630096 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.648841 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-55c2-account-create-update-478qb"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.676632 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22s59\" (UniqueName: \"kubernetes.io/projected/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-kube-api-access-22s59\") pod \"nova-api-db-create-6czqc\" (UID: \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\") " pod="openstack/nova-api-db-create-6czqc" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.723417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b5eda2-7879-473b-b45d-e5ef3d128fa4-operator-scripts\") pod \"nova-api-55c2-account-create-update-478qb\" (UID: \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\") " pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.723571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rms\" (UniqueName: \"kubernetes.io/projected/54b5eda2-7879-473b-b45d-e5ef3d128fa4-kube-api-access-z7rms\") pod \"nova-api-55c2-account-create-update-478qb\" (UID: \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\") " pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.762416 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4k5x8"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.764771 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.798038 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4k5x8"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.815717 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6czqc" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.826341 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b5eda2-7879-473b-b45d-e5ef3d128fa4-operator-scripts\") pod \"nova-api-55c2-account-create-update-478qb\" (UID: \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\") " pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.826462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7rms\" (UniqueName: \"kubernetes.io/projected/54b5eda2-7879-473b-b45d-e5ef3d128fa4-kube-api-access-z7rms\") pod \"nova-api-55c2-account-create-update-478qb\" (UID: \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\") " pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.827713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b5eda2-7879-473b-b45d-e5ef3d128fa4-operator-scripts\") pod \"nova-api-55c2-account-create-update-478qb\" (UID: \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\") " pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.848189 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-54df-account-create-update-mnv5k"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.849979 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.872015 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.878970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7rms\" (UniqueName: \"kubernetes.io/projected/54b5eda2-7879-473b-b45d-e5ef3d128fa4-kube-api-access-z7rms\") pod \"nova-api-55c2-account-create-update-478qb\" (UID: \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\") " pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.911984 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wwgd9"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.914300 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.930464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-operator-scripts\") pod \"nova-cell1-db-create-wwgd9\" (UID: \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\") " pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.930676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53d974f-353d-49e8-9347-658cf61ed52b-operator-scripts\") pod \"nova-cell0-db-create-4k5x8\" (UID: \"f53d974f-353d-49e8-9347-658cf61ed52b\") " pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.930728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqr7\" (UniqueName: \"kubernetes.io/projected/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-kube-api-access-dnqr7\") pod \"nova-cell1-db-create-wwgd9\" (UID: \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\") " pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.930784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffa815c9-d525-45d3-a179-dce588ffd65d-operator-scripts\") pod \"nova-cell0-54df-account-create-update-mnv5k\" (UID: \"ffa815c9-d525-45d3-a179-dce588ffd65d\") " pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.930846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/ffa815c9-d525-45d3-a179-dce588ffd65d-kube-api-access-522kq\") pod \"nova-cell0-54df-account-create-update-mnv5k\" (UID: \"ffa815c9-d525-45d3-a179-dce588ffd65d\") " pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.930958 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75b7h\" (UniqueName: \"kubernetes.io/projected/f53d974f-353d-49e8-9347-658cf61ed52b-kube-api-access-75b7h\") pod \"nova-cell0-db-create-4k5x8\" (UID: \"f53d974f-353d-49e8-9347-658cf61ed52b\") " pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.941055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-54df-account-create-update-mnv5k"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.959416 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wwgd9"] Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.962254 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.989435 4749 generic.go:334] "Generic (PLEG): container finished" podID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerID="fe96bafc836b8fe4bd89f61179e16c46df82a02f668b823efbf0da3bd28aebb7" exitCode=0 Jan 28 18:59:51 crc kubenswrapper[4749]: I0128 18:59:51.989749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c4707cc-a6c3-4f08-870d-9d8c3a9da583","Type":"ContainerDied","Data":"fe96bafc836b8fe4bd89f61179e16c46df82a02f668b823efbf0da3bd28aebb7"} Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.006485 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerID="880a7dc39a6a4a6ccdae1fd997c0bd3c477728c52d4b8bf60922ccbc63c7cd91" exitCode=143 Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.006538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c312272-e82e-4cdc-8501-5b27b63a3ba0","Type":"ContainerDied","Data":"880a7dc39a6a4a6ccdae1fd997c0bd3c477728c52d4b8bf60922ccbc63c7cd91"} Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.021690 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-68d1-account-create-update-6vdlq"] Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.023674 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.030999 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.033511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqr7\" (UniqueName: \"kubernetes.io/projected/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-kube-api-access-dnqr7\") pod \"nova-cell1-db-create-wwgd9\" (UID: \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\") " pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.033568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffa815c9-d525-45d3-a179-dce588ffd65d-operator-scripts\") pod \"nova-cell0-54df-account-create-update-mnv5k\" (UID: \"ffa815c9-d525-45d3-a179-dce588ffd65d\") " pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.033622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/ffa815c9-d525-45d3-a179-dce588ffd65d-kube-api-access-522kq\") pod \"nova-cell0-54df-account-create-update-mnv5k\" (UID: \"ffa815c9-d525-45d3-a179-dce588ffd65d\") " pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.033694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75b7h\" (UniqueName: \"kubernetes.io/projected/f53d974f-353d-49e8-9347-658cf61ed52b-kube-api-access-75b7h\") pod \"nova-cell0-db-create-4k5x8\" (UID: \"f53d974f-353d-49e8-9347-658cf61ed52b\") " pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.033774 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-operator-scripts\") pod \"nova-cell1-db-create-wwgd9\" (UID: \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\") " pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.033964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53d974f-353d-49e8-9347-658cf61ed52b-operator-scripts\") pod \"nova-cell0-db-create-4k5x8\" (UID: \"f53d974f-353d-49e8-9347-658cf61ed52b\") " pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.036930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffa815c9-d525-45d3-a179-dce588ffd65d-operator-scripts\") pod \"nova-cell0-54df-account-create-update-mnv5k\" (UID: \"ffa815c9-d525-45d3-a179-dce588ffd65d\") " pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.037263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-operator-scripts\") pod \"nova-cell1-db-create-wwgd9\" (UID: \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\") " pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.037783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53d974f-353d-49e8-9347-658cf61ed52b-operator-scripts\") pod \"nova-cell0-db-create-4k5x8\" (UID: \"f53d974f-353d-49e8-9347-658cf61ed52b\") " pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.043257 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-68d1-account-create-update-6vdlq"] Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.063026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqr7\" (UniqueName: \"kubernetes.io/projected/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-kube-api-access-dnqr7\") pod \"nova-cell1-db-create-wwgd9\" (UID: \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\") " pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.069086 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75b7h\" (UniqueName: \"kubernetes.io/projected/f53d974f-353d-49e8-9347-658cf61ed52b-kube-api-access-75b7h\") pod \"nova-cell0-db-create-4k5x8\" (UID: \"f53d974f-353d-49e8-9347-658cf61ed52b\") " pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.098057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/ffa815c9-d525-45d3-a179-dce588ffd65d-kube-api-access-522kq\") pod \"nova-cell0-54df-account-create-update-mnv5k\" (UID: \"ffa815c9-d525-45d3-a179-dce588ffd65d\") " pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.169982 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfb40a95-9140-4421-8f68-ecc6870d903a-operator-scripts\") pod \"nova-cell1-68d1-account-create-update-6vdlq\" (UID: 
\"dfb40a95-9140-4421-8f68-ecc6870d903a\") " pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.171098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljqw6\" (UniqueName: \"kubernetes.io/projected/dfb40a95-9140-4421-8f68-ecc6870d903a-kube-api-access-ljqw6\") pod \"nova-cell1-68d1-account-create-update-6vdlq\" (UID: \"dfb40a95-9140-4421-8f68-ecc6870d903a\") " pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.439229 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.440239 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljqw6\" (UniqueName: \"kubernetes.io/projected/dfb40a95-9140-4421-8f68-ecc6870d903a-kube-api-access-ljqw6\") pod \"nova-cell1-68d1-account-create-update-6vdlq\" (UID: \"dfb40a95-9140-4421-8f68-ecc6870d903a\") " pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.440407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfb40a95-9140-4421-8f68-ecc6870d903a-operator-scripts\") pod \"nova-cell1-68d1-account-create-update-6vdlq\" (UID: \"dfb40a95-9140-4421-8f68-ecc6870d903a\") " pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.441025 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfb40a95-9140-4421-8f68-ecc6870d903a-operator-scripts\") pod \"nova-cell1-68d1-account-create-update-6vdlq\" (UID: \"dfb40a95-9140-4421-8f68-ecc6870d903a\") " pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.441236 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.441798 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.470644 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljqw6\" (UniqueName: \"kubernetes.io/projected/dfb40a95-9140-4421-8f68-ecc6870d903a-kube-api-access-ljqw6\") pod \"nova-cell1-68d1-account-create-update-6vdlq\" (UID: \"dfb40a95-9140-4421-8f68-ecc6870d903a\") " pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:52 crc kubenswrapper[4749]: I0128 18:59:52.656893 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:52 crc kubenswrapper[4749]: E0128 18:59:52.795479 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Jan 28 18:59:52 crc kubenswrapper[4749]: E0128 18:59:52.795900 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f7h78h6h67bh555h5d9h67fhc5h67dh5fdh5f8h564h697h646h58bh64h568hbbhc7hb6h5fh5b9h57bh94h556hd8h64bh5c4h587h5bdh694h6dq,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_CA_CERT,Value:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qklhd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(1b70372a-0994-4bb9-8369-7b00699ee7c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 18:59:52 crc kubenswrapper[4749]: E0128 18:59:52.798410 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="1b70372a-0994-4bb9-8369-7b00699ee7c0" Jan 28 18:59:53 crc kubenswrapper[4749]: I0128 18:59:53.034436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-694b5876b5-phgqj" 
event={"ID":"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5","Type":"ContainerStarted","Data":"6aa24e5e3fb99d93d5ddc56c43907c0baf17ee8a8ed5a4b563966f1d3a4a041c"} Jan 28 18:59:53 crc kubenswrapper[4749]: E0128 18:59:53.057342 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="1b70372a-0994-4bb9-8369-7b00699ee7c0" Jan 28 18:59:53 crc kubenswrapper[4749]: I0128 18:59:53.251344 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 18:59:53 crc kubenswrapper[4749]: I0128 18:59:53.881608 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="c145454f-dbd9-46ff-b588-6c86198dc2e6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.211:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 18:59:53 crc kubenswrapper[4749]: I0128 18:59:53.979955 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-68fb6b79f7-lxql7" Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.071902 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67c94bd564-mjrws"] Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.072375 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67c94bd564-mjrws" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerName="neutron-api" containerID="cri-o://917d85169a5b197c14265a97b8d1a2cbf02b19faa3f23321a8339221122881d0" gracePeriod=30 Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.072853 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67c94bd564-mjrws" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerName="neutron-httpd" containerID="cri-o://076ee848881ff4c5f111de5abc89f0713e2834b80064fbd9e09e8fade6b522bc" gracePeriod=30 Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.783873 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.816707 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.932946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data\") pod \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933097 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-sb\") pod \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933178 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-combined-ca-bundle\") pod \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data-custom\") pod \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-svc\") pod \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933381 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f842l\" (UniqueName: \"kubernetes.io/projected/61f9e37b-7e92-454e-8806-ab9bda2efbf7-kube-api-access-f842l\") pod \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-scripts\") pod \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-swift-storage-0\") pod \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933463 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtmcs\" (UniqueName: \"kubernetes.io/projected/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-kube-api-access-rtmcs\") pod \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-nb\") pod 
\"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933539 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61f9e37b-7e92-454e-8806-ab9bda2efbf7-etc-machine-id\") pod \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\" (UID: \"61f9e37b-7e92-454e-8806-ab9bda2efbf7\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.933555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-config\") pod \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\" (UID: \"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c\") " Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.952775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61f9e37b-7e92-454e-8806-ab9bda2efbf7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "61f9e37b-7e92-454e-8806-ab9bda2efbf7" (UID: "61f9e37b-7e92-454e-8806-ab9bda2efbf7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.976976 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-scripts" (OuterVolumeSpecName: "scripts") pod "61f9e37b-7e92-454e-8806-ab9bda2efbf7" (UID: "61f9e37b-7e92-454e-8806-ab9bda2efbf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.981734 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-kube-api-access-rtmcs" (OuterVolumeSpecName: "kube-api-access-rtmcs") pod "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" (UID: "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c"). InnerVolumeSpecName "kube-api-access-rtmcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:54 crc kubenswrapper[4749]: I0128 18:59:54.982520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f9e37b-7e92-454e-8806-ab9bda2efbf7-kube-api-access-f842l" (OuterVolumeSpecName: "kube-api-access-f842l") pod "61f9e37b-7e92-454e-8806-ab9bda2efbf7" (UID: "61f9e37b-7e92-454e-8806-ab9bda2efbf7"). InnerVolumeSpecName "kube-api-access-f842l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:54.995515 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "61f9e37b-7e92-454e-8806-ab9bda2efbf7" (UID: "61f9e37b-7e92-454e-8806-ab9bda2efbf7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.048677 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.048861 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.048874 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f842l\" (UniqueName: \"kubernetes.io/projected/61f9e37b-7e92-454e-8806-ab9bda2efbf7-kube-api-access-f842l\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.048887 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtmcs\" (UniqueName: \"kubernetes.io/projected/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-kube-api-access-rtmcs\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.048898 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61f9e37b-7e92-454e-8806-ab9bda2efbf7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.095230 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" (UID: "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.113855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-config" (OuterVolumeSpecName: "config") pod "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" (UID: "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.115258 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-9dfc74cb7-xtw7l" podUID="be080cb1-98cd-4790-8132-52818502d1a0" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.209:8004/healthcheck\": read tcp 10.217.0.2:57540->10.217.0.209:8004: read: connection reset by peer" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.116046 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-9dfc74cb7-xtw7l" podUID="be080cb1-98cd-4790-8132-52818502d1a0" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.209:8004/healthcheck\": dial tcp 10.217.0.209:8004: connect: connection refused" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.129738 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="ceilometer-central-agent" containerID="cri-o://81f41dea5270f00febe5e1418c94c60437f9b80add2450a56e984c8a68e0a319" gracePeriod=30 Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.131219 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="proxy-httpd" containerID="cri-o://480f46ce4e474b47094116652cd0aa94cc8ea71c261898a28da6379393ce8a68" gracePeriod=30 Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.131461 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="sg-core" containerID="cri-o://1f5d2ba6e0f4233f2795d3ab8b6db4e9c2f3b04c37e17a7039e7728f385ccdc9" gracePeriod=30 Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.132085 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="ceilometer-notification-agent" containerID="cri-o://28377c98a249b2dbc6102f099da44395b0e5123ddd994b459942eb436a3d6eaf" gracePeriod=30 Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.147476 4749 generic.go:334] "Generic (PLEG): container finished" podID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerID="076ee848881ff4c5f111de5abc89f0713e2834b80064fbd9e09e8fade6b522bc" exitCode=0 Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.153676 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-config\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.153715 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.178943 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerID="8224af6ead75455ff44004cb6400c942a48571511762516f155cf6f0ca763836" exitCode=0 Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.179969 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.875633575 podStartE2EDuration="22.179949805s" podCreationTimestamp="2026-01-28 18:59:33 +0000 UTC" 
firstStartedPulling="2026-01-28 18:59:35.168405466 +0000 UTC m=+1443.179932251" lastFinishedPulling="2026-01-28 18:59:54.472721706 +0000 UTC m=+1462.484248481" observedRunningTime="2026-01-28 18:59:55.166890619 +0000 UTC m=+1463.178417404" watchObservedRunningTime="2026-01-28 18:59:55.179949805 +0000 UTC m=+1463.191476580" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.182081 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.184320 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.197198 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" podUID="3ff11c89-e09d-40a3-ac11-4eece14288c9" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.208:8000/healthcheck\": read tcp 10.217.0.2:55154->10.217.0.208:8000: read: connection reset by peer" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.198005 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" podUID="3ff11c89-e09d-40a3-ac11-4eece14288c9" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.208:8000/healthcheck\": dial tcp 10.217.0.208:8000: connect: connection refused" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.201204 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" (UID: "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.256632 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-694b5876b5-phgqj" podStartSLOduration=16.256609092 podStartE2EDuration="16.256609092s" podCreationTimestamp="2026-01-28 18:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:55.238302666 +0000 UTC m=+1463.249829451" watchObservedRunningTime="2026-01-28 18:59:55.256609092 +0000 UTC m=+1463.268135867" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.284071 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.353574 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" (UID: "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.380159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61f9e37b-7e92-454e-8806-ab9bda2efbf7" (UID: "61f9e37b-7e92-454e-8806-ab9bda2efbf7"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.388260 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.388287 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.408070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" (UID: "85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433217 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433564 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433608 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerStarted","Data":"480f46ce4e474b47094116652cd0aa94cc8ea71c261898a28da6379393ce8a68"} Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c94bd564-mjrws" event={"ID":"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc","Type":"ContainerDied","Data":"076ee848881ff4c5f111de5abc89f0713e2834b80064fbd9e09e8fade6b522bc"} Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c312272-e82e-4cdc-8501-5b27b63a3ba0","Type":"ContainerDied","Data":"8224af6ead75455ff44004cb6400c942a48571511762516f155cf6f0ca763836"} Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"61f9e37b-7e92-454e-8806-ab9bda2efbf7","Type":"ContainerDied","Data":"6caf918f72f8b9dfaca988b83256e846afd8601487b781783c660cdd29542423"} Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-l7sjv" event={"ID":"85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c","Type":"ContainerDied","Data":"6e3fc7b90f4ca072ce01a1845d87482ace1ee5d10e2a74794725c3fcd787c0ad"} Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-694b5876b5-phgqj" event={"ID":"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5","Type":"ContainerStarted","Data":"590059963b084effa612e3925ca637cc0c9a6ad914529da3327e3b61787f0327"} Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-proxy-694b5876b5-phgqj" event={"ID":"4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5","Type":"ContainerStarted","Data":"04efdd7981aba833c137002a205119d25ff8f0d487c23ad3f667863e9cd494df"} Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.433776 4749 scope.go:117] "RemoveContainer" containerID="d0cc3923523498884708fa069a1e51907aff2b85fa0b052a7343b1960f00cb3a" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.448882 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.491077 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.554465 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data" (OuterVolumeSpecName: "config-data") pod "61f9e37b-7e92-454e-8806-ab9bda2efbf7" (UID: "61f9e37b-7e92-454e-8806-ab9bda2efbf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.593115 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-scripts\") pod \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.597657 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-l7sjv"] Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598276 4749 scope.go:117] "RemoveContainer" containerID="d91a884ea3421785e3434792b37594dbbfc17a0d60eb9fae3f460c471d0eb3ec" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598263 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-scripts" (OuterVolumeSpecName: "scripts") pod "9c4707cc-a6c3-4f08-870d-9d8c3a9da583" (UID: "9c4707cc-a6c3-4f08-870d-9d8c3a9da583"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598494 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-logs\") pod \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz5zn\" (UniqueName: \"kubernetes.io/projected/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-kube-api-access-bz5zn\") pod \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598703 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-public-tls-certs\") pod \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-config-data\") pod \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-combined-ca-bundle\") pod \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.598919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-httpd-run\") pod \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\" (UID: \"9c4707cc-a6c3-4f08-870d-9d8c3a9da583\") " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.599621 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f9e37b-7e92-454e-8806-ab9bda2efbf7-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.599633 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.599896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9c4707cc-a6c3-4f08-870d-9d8c3a9da583" (UID: "9c4707cc-a6c3-4f08-870d-9d8c3a9da583"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.601936 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-kube-api-access-bz5zn" (OuterVolumeSpecName: "kube-api-access-bz5zn") pod "9c4707cc-a6c3-4f08-870d-9d8c3a9da583" (UID: "9c4707cc-a6c3-4f08-870d-9d8c3a9da583"). InnerVolumeSpecName "kube-api-access-bz5zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.602656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-logs" (OuterVolumeSpecName: "logs") pod "9c4707cc-a6c3-4f08-870d-9d8c3a9da583" (UID: "9c4707cc-a6c3-4f08-870d-9d8c3a9da583"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.630047 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-l7sjv"] Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.668626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-74bcff4f86-hqml8"] Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.672367 4749 scope.go:117] "RemoveContainer" containerID="1fb8426f821aecf4deb16aef03b8357aaa391d6697d11c2f6bf72f61aafe5a84" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.675065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be" (OuterVolumeSpecName: "glance") pod "9c4707cc-a6c3-4f08-870d-9d8c3a9da583" (UID: "9c4707cc-a6c3-4f08-870d-9d8c3a9da583"). InnerVolumeSpecName "pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.692052 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-config-data" (OuterVolumeSpecName: "config-data") pod "9c4707cc-a6c3-4f08-870d-9d8c3a9da583" (UID: "9c4707cc-a6c3-4f08-870d-9d8c3a9da583"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.700162 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-68d1-account-create-update-6vdlq"] Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.706908 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.706935 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.706958 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") on node \"crc\" " Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.706970 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-logs\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.706981 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz5zn\" (UniqueName: \"kubernetes.io/projected/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-kube-api-access-bz5zn\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.709817 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c4707cc-a6c3-4f08-870d-9d8c3a9da583" (UID: "9c4707cc-a6c3-4f08-870d-9d8c3a9da583"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.725528 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-54df-account-create-update-mnv5k"] Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.742222 4749 scope.go:117] "RemoveContainer" containerID="dc373c830b501a6dce3e2a8d0cda0baded650fde68e4961128391a502b950f24" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.742423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9c4707cc-a6c3-4f08-870d-9d8c3a9da583" (UID: "9c4707cc-a6c3-4f08-870d-9d8c3a9da583"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:55 crc kubenswrapper[4749]: W0128 18:59:55.754121 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffa815c9_d525_45d3_a179_dce588ffd65d.slice/crio-55b721ce53d5eedea3b64feba41626c2549d530b285364e4183e2a444b50800c WatchSource:0}: Error finding container 55b721ce53d5eedea3b64feba41626c2549d530b285364e4183e2a444b50800c: Status 404 returned error can't find the container with id 55b721ce53d5eedea3b64feba41626c2549d530b285364e4183e2a444b50800c Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.790714 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.797043 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be") on node "crc" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.820504 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.820811 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.821721 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c4707cc-a6c3-4f08-870d-9d8c3a9da583-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.883848 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.939003 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.960503 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:55 crc kubenswrapper[4749]: E0128 18:59:55.961078 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerName="dnsmasq-dns" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961095 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerName="dnsmasq-dns" Jan 28 18:59:55 crc kubenswrapper[4749]: E0128 18:59:55.961118 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerName="glance-httpd" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961124 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerName="glance-httpd" Jan 28 18:59:55 crc kubenswrapper[4749]: E0128 18:59:55.961131 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerName="probe" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961139 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" 
containerName="probe" Jan 28 18:59:55 crc kubenswrapper[4749]: E0128 18:59:55.961158 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerName="glance-log" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961164 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerName="glance-log" Jan 28 18:59:55 crc kubenswrapper[4749]: E0128 18:59:55.961182 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerName="cinder-scheduler" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961188 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerName="cinder-scheduler" Jan 28 18:59:55 crc kubenswrapper[4749]: E0128 18:59:55.961197 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerName="init" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961203 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerName="init" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961440 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerName="probe" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961453 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerName="glance-httpd" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961468 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" containerName="glance-log" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961477 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" containerName="dnsmasq-dns" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.961485 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" containerName="cinder-scheduler" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.962782 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 18:59:55 crc kubenswrapper[4749]: I0128 18:59:55.965384 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.001458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.132840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk4cr\" (UniqueName: \"kubernetes.io/projected/f31cc43a-6cff-4aec-8306-e6d0eb59f973-kube-api-access-bk4cr\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.132920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.132950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-scripts\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.132970 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.133036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f31cc43a-6cff-4aec-8306-e6d0eb59f973-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.133102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-config-data\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.261588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f31cc43a-6cff-4aec-8306-e6d0eb59f973-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.261682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-config-data\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.262241 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bk4cr\" (UniqueName: \"kubernetes.io/projected/f31cc43a-6cff-4aec-8306-e6d0eb59f973-kube-api-access-bk4cr\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.262358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.262387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-scripts\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.262410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.266018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f31cc43a-6cff-4aec-8306-e6d0eb59f973-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.286692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.290393 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-config-data\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.292229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-scripts\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.294815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f31cc43a-6cff-4aec-8306-e6d0eb59f973-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.305041 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk4cr\" (UniqueName: \"kubernetes.io/projected/f31cc43a-6cff-4aec-8306-e6d0eb59f973-kube-api-access-bk4cr\") pod \"cinder-scheduler-0\" (UID: \"f31cc43a-6cff-4aec-8306-e6d0eb59f973\") " pod="openstack/cinder-scheduler-0" 
Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.322672 4749 generic.go:334] "Generic (PLEG): container finished" podID="26a1df38-b5be-4122-a907-fadff9ea2487" containerID="1f5d2ba6e0f4233f2795d3ab8b6db4e9c2f3b04c37e17a7039e7728f385ccdc9" exitCode=2 Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.323029 4749 generic.go:334] "Generic (PLEG): container finished" podID="26a1df38-b5be-4122-a907-fadff9ea2487" containerID="81f41dea5270f00febe5e1418c94c60437f9b80add2450a56e984c8a68e0a319" exitCode=0 Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.323113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerDied","Data":"1f5d2ba6e0f4233f2795d3ab8b6db4e9c2f3b04c37e17a7039e7728f385ccdc9"} Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.323146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerDied","Data":"81f41dea5270f00febe5e1418c94c60437f9b80add2450a56e984c8a68e0a319"} Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.364068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" event={"ID":"dfb40a95-9140-4421-8f68-ecc6870d903a","Type":"ContainerStarted","Data":"f63ac6f32ed42f2923c7f72e9b8445760a17f0b6e087a091cb85526bbf1b5ac5"} Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.415669 4749 generic.go:334] "Generic (PLEG): container finished" podID="3ff11c89-e09d-40a3-ac11-4eece14288c9" containerID="1a377df682203d3246a5153e52e52e9331ff0367563f0295d3cbcafeffd0cee3" exitCode=0 Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.415762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" event={"ID":"3ff11c89-e09d-40a3-ac11-4eece14288c9","Type":"ContainerDied","Data":"1a377df682203d3246a5153e52e52e9331ff0367563f0295d3cbcafeffd0cee3"} Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.439546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9dfc74cb7-xtw7l" event={"ID":"be080cb1-98cd-4790-8132-52818502d1a0","Type":"ContainerDied","Data":"f742fe22d8ff1ea7a10f5755d2836f77ec6120570265207db51bb9533c093c5c"} Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.439496 4749 generic.go:334] "Generic (PLEG): container finished" podID="be080cb1-98cd-4790-8132-52818502d1a0" containerID="f742fe22d8ff1ea7a10f5755d2836f77ec6120570265207db51bb9533c093c5c" exitCode=0 Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.453075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" event={"ID":"b6dba594-df49-48ca-9cdf-5ccb4a76d74c","Type":"ContainerStarted","Data":"de2191ebb312e58cd3c7f1281361ac7fd8ebdd7276481b7dc030961dccf16524"} Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.456045 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-54df-account-create-update-mnv5k" event={"ID":"ffa815c9-d525-45d3-a179-dce588ffd65d","Type":"ContainerStarted","Data":"55b721ce53d5eedea3b64feba41626c2549d530b285364e4183e2a444b50800c"} Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.475671 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.491235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c4707cc-a6c3-4f08-870d-9d8c3a9da583","Type":"ContainerDied","Data":"0c42db171e0f95a210f1c598b45708754b6ed5d18e2da9b9b34d7e420dab63e9"} Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.491300 4749 scope.go:117] "RemoveContainer" containerID="fe96bafc836b8fe4bd89f61179e16c46df82a02f668b823efbf0da3bd28aebb7" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.491588 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.499106 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-54df-account-create-update-mnv5k" podStartSLOduration=5.49909149 podStartE2EDuration="5.49909149s" podCreationTimestamp="2026-01-28 18:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:56.49589572 +0000 UTC m=+1464.507422495" watchObservedRunningTime="2026-01-28 18:59:56.49909149 +0000 UTC m=+1464.510618265" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.570636 4749 scope.go:117] "RemoveContainer" containerID="3c117463354fb01b8ba3165872468641387f0f3541f0ffe39c4e83aaa554b672" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.606879 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.645772 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.660892 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.663747 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:59:56 crc kubenswrapper[4749]: E0128 18:59:56.664222 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerName="glance-httpd" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.664235 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerName="glance-httpd" Jan 28 18:59:56 crc kubenswrapper[4749]: E0128 18:59:56.664256 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerName="glance-log" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.664262 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerName="glance-log" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.664541 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerName="glance-log" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.664563 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" containerName="glance-httpd" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.674362 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.674464 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.677485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.677613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-combined-ca-bundle\") pod \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.677712 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-config-data\") pod \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.677746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-httpd-run\") pod \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.677793 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcrjm\" (UniqueName: \"kubernetes.io/projected/6c312272-e82e-4cdc-8501-5b27b63a3ba0-kube-api-access-gcrjm\") pod \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " Jan 28 18:59:56 
crc kubenswrapper[4749]: I0128 18:59:56.677876 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-logs\") pod \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.677921 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-internal-tls-certs\") pod \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.677959 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-scripts\") pod \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\" (UID: \"6c312272-e82e-4cdc-8501-5b27b63a3ba0\") " Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.680908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-logs" (OuterVolumeSpecName: "logs") pod "6c312272-e82e-4cdc-8501-5b27b63a3ba0" (UID: "6c312272-e82e-4cdc-8501-5b27b63a3ba0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.682085 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.682342 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.682463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6c312272-e82e-4cdc-8501-5b27b63a3ba0" (UID: "6c312272-e82e-4cdc-8501-5b27b63a3ba0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.692102 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c312272-e82e-4cdc-8501-5b27b63a3ba0-kube-api-access-gcrjm" (OuterVolumeSpecName: "kube-api-access-gcrjm") pod "6c312272-e82e-4cdc-8501-5b27b63a3ba0" (UID: "6c312272-e82e-4cdc-8501-5b27b63a3ba0"). InnerVolumeSpecName "kube-api-access-gcrjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.692239 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-scripts" (OuterVolumeSpecName: "scripts") pod "6c312272-e82e-4cdc-8501-5b27b63a3ba0" (UID: "6c312272-e82e-4cdc-8501-5b27b63a3ba0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.771135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63" (OuterVolumeSpecName: "glance") pod "6c312272-e82e-4cdc-8501-5b27b63a3ba0" (UID: "6c312272-e82e-4cdc-8501-5b27b63a3ba0"). InnerVolumeSpecName "pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.790274 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcrjm\" (UniqueName: \"kubernetes.io/projected/6c312272-e82e-4cdc-8501-5b27b63a3ba0-kube-api-access-gcrjm\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.790306 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-logs\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.790316 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.790362 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") on node \"crc\" " Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.790373 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c312272-e82e-4cdc-8501-5b27b63a3ba0-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.865538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-config-data" (OuterVolumeSpecName: "config-data") pod "6c312272-e82e-4cdc-8501-5b27b63a3ba0" (UID: "6c312272-e82e-4cdc-8501-5b27b63a3ba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.869348 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c312272-e82e-4cdc-8501-5b27b63a3ba0" (UID: "6c312272-e82e-4cdc-8501-5b27b63a3ba0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.886562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6c312272-e82e-4cdc-8501-5b27b63a3ba0" (UID: "6c312272-e82e-4cdc-8501-5b27b63a3ba0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.948093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.948218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-config-data\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.948259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-scripts\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.948394 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.948430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.948533 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.948614 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z97vs\" (UniqueName: \"kubernetes.io/projected/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-kube-api-access-z97vs\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.948675 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-logs\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.949136 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.949163 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.949176 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c312272-e82e-4cdc-8501-5b27b63a3ba0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.949941 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 18:59:56 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 18:59:56 crc kubenswrapper[4749]: > Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.967101 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 18:59:56 crc kubenswrapper[4749]: I0128 18:59:56.967567 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63") on node "crc" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.055539 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f9e37b-7e92-454e-8806-ab9bda2efbf7" path="/var/lib/kubelet/pods/61f9e37b-7e92-454e-8806-ab9bda2efbf7/volumes" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.067765 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c" path="/var/lib/kubelet/pods/85a7b645-fc7f-4a3a-a7d3-39b0d7248e5c/volumes" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.076481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z97vs\" (UniqueName: \"kubernetes.io/projected/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-kube-api-access-z97vs\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.077618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-logs\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.077754 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4707cc-a6c3-4f08-870d-9d8c3a9da583" path="/var/lib/kubelet/pods/9c4707cc-a6c3-4f08-870d-9d8c3a9da583/volumes" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.099827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-logs\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.106260 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.106461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-config-data\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.106735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-scripts\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.106819 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.106846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.106944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.107094 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.107262 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-55c2-account-create-update-478qb"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.107303 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6czqc"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.108547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.112284 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z97vs\" (UniqueName: \"kubernetes.io/projected/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-kube-api-access-z97vs\") pod 
\"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.112903 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.112953 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/90c2183eeb61266815926dade9237ac0e2270af237c5c71fcfec8d480d6530dc/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.113210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.113685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-config-data\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.128977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-scripts\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.129717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39b5ea6-cb49-4c58-85c4-f9b274ec979b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.134155 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-758976cb66-pthtk"] Jan 28 18:59:57 crc kubenswrapper[4749]: W0128 18:59:57.155757 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0f8e837_ecac_47c7_9a8e_204d9bbbc42b.slice/crio-efa69aeb8a14dbf4f25a01d0ca268744e32cf9b4ab249ec468dd64bcdc52b60f WatchSource:0}: Error finding container efa69aeb8a14dbf4f25a01d0ca268744e32cf9b4ab249ec468dd64bcdc52b60f: Status 404 returned error can't find the container with id efa69aeb8a14dbf4f25a01d0ca268744e32cf9b4ab249ec468dd64bcdc52b60f Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.166497 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-778d479c94-bcw7x"] Jan 28 18:59:57 crc kubenswrapper[4749]: W0128 18:59:57.177465 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b3182d1_16fe_490b_80d1_b0d3445cf1c8.slice/crio-58e50dc72e533f73486339212ac2c8114d9e0f36c8f6014cff97bafafea5768f WatchSource:0}: Error finding container 58e50dc72e533f73486339212ac2c8114d9e0f36c8f6014cff97bafafea5768f: Status 404 returned error can't find the container with id 58e50dc72e533f73486339212ac2c8114d9e0f36c8f6014cff97bafafea5768f Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.251309 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.253564 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4k5x8"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.273061 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7b886dbf44-fw2gl"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.274431 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.290012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rdwp"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.315848 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5667478cf5-vcwtt"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.333204 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wwgd9"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.366960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ad035bc-0373-4c28-aeb0-962c3c1a85be\") pod \"glance-default-external-api-0\" (UID: \"a39b5ea6-cb49-4c58-85c4-f9b274ec979b\") " pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.442423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-combined-ca-bundle\") pod \"be080cb1-98cd-4790-8132-52818502d1a0\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.442604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhc9l\" (UniqueName: \"kubernetes.io/projected/3ff11c89-e09d-40a3-ac11-4eece14288c9-kube-api-access-lhc9l\") pod \"3ff11c89-e09d-40a3-ac11-4eece14288c9\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.442673 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data\") pod \"be080cb1-98cd-4790-8132-52818502d1a0\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.442727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data\") pod \"3ff11c89-e09d-40a3-ac11-4eece14288c9\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.442763 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data-custom\") pod \"be080cb1-98cd-4790-8132-52818502d1a0\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.442923 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-combined-ca-bundle\") pod \"3ff11c89-e09d-40a3-ac11-4eece14288c9\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.443078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27m79\" (UniqueName: \"kubernetes.io/projected/be080cb1-98cd-4790-8132-52818502d1a0-kube-api-access-27m79\") pod \"be080cb1-98cd-4790-8132-52818502d1a0\" (UID: \"be080cb1-98cd-4790-8132-52818502d1a0\") " Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.443123 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data-custom\") pod \"3ff11c89-e09d-40a3-ac11-4eece14288c9\" (UID: \"3ff11c89-e09d-40a3-ac11-4eece14288c9\") " Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.451166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ff11c89-e09d-40a3-ac11-4eece14288c9" (UID: "3ff11c89-e09d-40a3-ac11-4eece14288c9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.473403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff11c89-e09d-40a3-ac11-4eece14288c9-kube-api-access-lhc9l" (OuterVolumeSpecName: "kube-api-access-lhc9l") pod "3ff11c89-e09d-40a3-ac11-4eece14288c9" (UID: "3ff11c89-e09d-40a3-ac11-4eece14288c9"). InnerVolumeSpecName "kube-api-access-lhc9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.474789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be080cb1-98cd-4790-8132-52818502d1a0-kube-api-access-27m79" (OuterVolumeSpecName: "kube-api-access-27m79") pod "be080cb1-98cd-4790-8132-52818502d1a0" (UID: "be080cb1-98cd-4790-8132-52818502d1a0"). InnerVolumeSpecName "kube-api-access-27m79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.505642 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be080cb1-98cd-4790-8132-52818502d1a0" (UID: "be080cb1-98cd-4790-8132-52818502d1a0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.544667 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be080cb1-98cd-4790-8132-52818502d1a0" (UID: "be080cb1-98cd-4790-8132-52818502d1a0"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.544922 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffa815c9-d525-45d3-a179-dce588ffd65d" containerID="ff087e09a40e738faed05bb60d542d39ad374e0b9eb49c055e6252ae1ec69cfd" exitCode=0 Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.545009 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-54df-account-create-update-mnv5k" event={"ID":"ffa815c9-d525-45d3-a179-dce588ffd65d","Type":"ContainerDied","Data":"ff087e09a40e738faed05bb60d542d39ad374e0b9eb49c055e6252ae1ec69cfd"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.553739 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27m79\" (UniqueName: \"kubernetes.io/projected/be080cb1-98cd-4790-8132-52818502d1a0-kube-api-access-27m79\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.553782 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.553791 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.553803 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhc9l\" (UniqueName: \"kubernetes.io/projected/3ff11c89-e09d-40a3-ac11-4eece14288c9-kube-api-access-lhc9l\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.553812 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.568698 4749 generic.go:334] "Generic (PLEG): container finished" podID="dfb40a95-9140-4421-8f68-ecc6870d903a" containerID="d07a18bba8c99d44c5c8e3774f5f83558a74c9473d2a3c63dd4b63e688fcaba1" exitCode=0 Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.568772 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" event={"ID":"dfb40a95-9140-4421-8f68-ecc6870d903a","Type":"ContainerDied","Data":"d07a18bba8c99d44c5c8e3774f5f83558a74c9473d2a3c63dd4b63e688fcaba1"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.571871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" event={"ID":"3ff11c89-e09d-40a3-ac11-4eece14288c9","Type":"ContainerDied","Data":"48e815ac6521b4271d5d6dedebdb37a6fffa4f109be87dff46bc59f1757ecd15"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.571920 4749 scope.go:117] "RemoveContainer" containerID="1a377df682203d3246a5153e52e52e9331ff0367563f0295d3cbcafeffd0cee3" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.572047 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-66cbb7db47-sh8m4" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.588368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c312272-e82e-4cdc-8501-5b27b63a3ba0","Type":"ContainerDied","Data":"1fca54a7a8cce83a7a0a869c188b31c91edf8a4fbdec474357d56abd7a3093d4"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.588483 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.614233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-778d479c94-bcw7x" event={"ID":"7fdd109b-56e0-4cf0-a763-8573984ee415","Type":"ContainerStarted","Data":"9d53c2845385117874295acf2b165e3550e8acb1fbfc54fc24ce15ac18ae3214"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.628396 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.633595 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.634249 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.646709 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4k5x8" event={"ID":"f53d974f-353d-49e8-9347-658cf61ed52b","Type":"ContainerStarted","Data":"3d9e5be952ff8d7415bd3ca75394b4de51112f7bdf43b7f266cd7fb85a34af52"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.650737 4749 generic.go:334] "Generic (PLEG): container finished" podID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" containerID="b77987f8cf360347e2724320def6443d616088d3e1fc0398dc8804537dfd5912" exitCode=1 Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.650853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" event={"ID":"b6dba594-df49-48ca-9cdf-5ccb4a76d74c","Type":"ContainerDied","Data":"b77987f8cf360347e2724320def6443d616088d3e1fc0398dc8804537dfd5912"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.651484 4749 scope.go:117] "RemoveContainer" containerID="b77987f8cf360347e2724320def6443d616088d3e1fc0398dc8804537dfd5912" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.664989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-55c2-account-create-update-478qb" event={"ID":"54b5eda2-7879-473b-b45d-e5ef3d128fa4","Type":"ContainerStarted","Data":"2ebbd860d9db2a224975ff5290ff9e8066b8537386b8c9948e7b545c8f6aad71"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.681730 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" event={"ID":"df71ee0d-ef89-4e80-807c-2751810bca99","Type":"ContainerStarted","Data":"4ce599bed7fe80141e386ace54e79a23f0ce658481f69b4f24cabe8e0d9737b4"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.695273 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:59:57 crc kubenswrapper[4749]: E0128 18:59:57.695885 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be080cb1-98cd-4790-8132-52818502d1a0" containerName="heat-api" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.695913 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="be080cb1-98cd-4790-8132-52818502d1a0" containerName="heat-api" Jan 28 18:59:57 crc kubenswrapper[4749]: E0128 18:59:57.695956 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff11c89-e09d-40a3-ac11-4eece14288c9" containerName="heat-cfnapi" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.695964 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff11c89-e09d-40a3-ac11-4eece14288c9" containerName="heat-cfnapi" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.696214 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="be080cb1-98cd-4790-8132-52818502d1a0" containerName="heat-api" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.696234 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff11c89-e09d-40a3-ac11-4eece14288c9" containerName="heat-cfnapi" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.701938 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.710746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwgd9" event={"ID":"5b3182d1-16fe-490b-80d1-b0d3445cf1c8","Type":"ContainerStarted","Data":"58e50dc72e533f73486339212ac2c8114d9e0f36c8f6014cff97bafafea5768f"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.711809 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.712057 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.720819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9dfc74cb7-xtw7l" event={"ID":"be080cb1-98cd-4790-8132-52818502d1a0","Type":"ContainerDied","Data":"685518c8221fb62585c1bff29ae716bf513e7339e6cc4046deeb647f748d79db"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.720898 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-9dfc74cb7-xtw7l" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.723787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5667478cf5-vcwtt" event={"ID":"b7c300e5-3d03-481e-bd44-85d8f7d44e72","Type":"ContainerStarted","Data":"82d73f62ce898ae05a74fe78f8ec11ae82f7ed752a29937db466317ecb20dee2"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.726538 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.726575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rdwp" event={"ID":"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1","Type":"ContainerStarted","Data":"867e0329e8c76b080c3973010bf3947447300f85b4cf23dcbbf92bd91b2e6f5b"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.745809 4749 generic.go:334] "Generic (PLEG): container finished" podID="26a1df38-b5be-4122-a907-fadff9ea2487" containerID="28377c98a249b2dbc6102f099da44395b0e5123ddd994b459942eb436a3d6eaf" exitCode=0 Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.745881 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerDied","Data":"28377c98a249b2dbc6102f099da44395b0e5123ddd994b459942eb436a3d6eaf"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.748746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6czqc" event={"ID":"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b","Type":"ContainerStarted","Data":"efa69aeb8a14dbf4f25a01d0ca268744e32cf9b4ab249ec468dd64bcdc52b60f"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.750101 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-758976cb66-pthtk" event={"ID":"8a13bcdb-3c7e-487b-90a5-5f794941eb5d","Type":"ContainerStarted","Data":"c4e2426444dc34d24134a30250c3496a8ce5403cf222cd3cc92a93cd5cb1a76e"} Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.784403 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.881815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.881940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c07eea-b699-4b39-b6db-ad0a9536ebe4-logs\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.882009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.882038 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35c07eea-b699-4b39-b6db-ad0a9536ebe4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.882067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.882267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.882390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.882449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spft8\" (UniqueName: \"kubernetes.io/projected/35c07eea-b699-4b39-b6db-ad0a9536ebe4-kube-api-access-spft8\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: E0128 18:59:57.898580 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c312272_e82e_4cdc_8501_5b27b63a3ba0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb40a95_9140_4421_8f68_ecc6870d903a.slice/crio-d07a18bba8c99d44c5c8e3774f5f83558a74c9473d2a3c63dd4b63e688fcaba1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c312272_e82e_4cdc_8501_5b27b63a3ba0.slice/crio-1fca54a7a8cce83a7a0a869c188b31c91edf8a4fbdec474357d56abd7a3093d4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffa815c9_d525_45d3_a179_dce588ffd65d.slice/crio-conmon-ff087e09a40e738faed05bb60d542d39ad374e0b9eb49c055e6252ae1ec69cfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffa815c9_d525_45d3_a179_dce588ffd65d.slice/crio-ff087e09a40e738faed05bb60d542d39ad374e0b9eb49c055e6252ae1ec69cfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb40a95_9140_4421_8f68_ecc6870d903a.slice/crio-conmon-d07a18bba8c99d44c5c8e3774f5f83558a74c9473d2a3c63dd4b63e688fcaba1.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6dba594_df49_48ca_9cdf_5ccb4a76d74c.slice/crio-b77987f8cf360347e2724320def6443d616088d3e1fc0398dc8804537dfd5912.scope\": RecentStats: unable to find data in memory cache]" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.973927 4749 scope.go:117] "RemoveContainer" containerID="8224af6ead75455ff44004cb6400c942a48571511762516f155cf6f0ca763836" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.985066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.985182 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.985242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spft8\" (UniqueName: \"kubernetes.io/projected/35c07eea-b699-4b39-b6db-ad0a9536ebe4-kube-api-access-spft8\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.985281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.985396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c07eea-b699-4b39-b6db-ad0a9536ebe4-logs\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.985460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.985492 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35c07eea-b699-4b39-b6db-ad0a9536ebe4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.985525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " 
pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.988050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35c07eea-b699-4b39-b6db-ad0a9536ebe4-logs\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:57 crc kubenswrapper[4749]: I0128 18:59:57.990817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35c07eea-b699-4b39-b6db-ad0a9536ebe4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.015279 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.016171 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.034533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.035394 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.035417 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/71cf6cfdb445cd2c33acb8ae4dd1b724414e52d8c80d6fd9f949055d3c3fc984/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.035661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c07eea-b699-4b39-b6db-ad0a9536ebe4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.039336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spft8\" (UniqueName: \"kubernetes.io/projected/35c07eea-b699-4b39-b6db-ad0a9536ebe4-kube-api-access-spft8\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.374920 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.375278 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.523830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data" (OuterVolumeSpecName: "config-data") pod "be080cb1-98cd-4790-8132-52818502d1a0" (UID: "be080cb1-98cd-4790-8132-52818502d1a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.576268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ff11c89-e09d-40a3-ac11-4eece14288c9" (UID: "3ff11c89-e09d-40a3-ac11-4eece14288c9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.609369 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.609396 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be080cb1-98cd-4790-8132-52818502d1a0-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.786181 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4k5x8" event={"ID":"f53d974f-353d-49e8-9347-658cf61ed52b","Type":"ContainerStarted","Data":"d8413800bacc7001e486a90301643948645e44db627651c5e1d9f1588ec6c17c"} Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.789282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67bceac6-4ed4-4452-9055-41fcf9cc5b63\") pod \"glance-default-internal-api-0\" (UID: \"35c07eea-b699-4b39-b6db-ad0a9536ebe4\") " pod="openstack/glance-default-internal-api-0" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.803394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data" (OuterVolumeSpecName: "config-data") pod "3ff11c89-e09d-40a3-ac11-4eece14288c9" (UID: "3ff11c89-e09d-40a3-ac11-4eece14288c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.811014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-55c2-account-create-update-478qb" event={"ID":"54b5eda2-7879-473b-b45d-e5ef3d128fa4","Type":"ContainerStarted","Data":"e65783790d4933c836c67b35bd2edce8ffbf949b50c09754fc13674b22b8253e"} Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.814735 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.822235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f31cc43a-6cff-4aec-8306-e6d0eb59f973","Type":"ContainerStarted","Data":"3832f1f99b0abe15b1ac789fc2e5efa4de0e38e7b5c1b92fdeddfd30cc77a3aa"} Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.827210 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff11c89-e09d-40a3-ac11-4eece14288c9-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.845196 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-4k5x8" podStartSLOduration=7.845176048 podStartE2EDuration="7.845176048s" podCreationTimestamp="2026-01-28 18:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:58.815564062 +0000 UTC m=+1466.827090827" watchObservedRunningTime="2026-01-28 18:59:58.845176048 +0000 UTC m=+1466.856702823" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.867205 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-55c2-account-create-update-478qb" podStartSLOduration=7.867184046 podStartE2EDuration="7.867184046s" podCreationTimestamp="2026-01-28 18:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 18:59:58.83476641 +0000 UTC m=+1466.846293195" watchObservedRunningTime="2026-01-28 18:59:58.867184046 +0000 UTC m=+1466.878710821" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.900972 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c312272-e82e-4cdc-8501-5b27b63a3ba0" path="/var/lib/kubelet/pods/6c312272-e82e-4cdc-8501-5b27b63a3ba0/volumes" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.936564 4749 scope.go:117] "RemoveContainer" containerID="880a7dc39a6a4a6ccdae1fd997c0bd3c477728c52d4b8bf60922ccbc63c7cd91" Jan 28 18:59:58 crc kubenswrapper[4749]: I0128 18:59:58.972975 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.375266 4749 scope.go:117] "RemoveContainer" containerID="f742fe22d8ff1ea7a10f5755d2836f77ec6120570265207db51bb9533c093c5c" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.472854 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-9dfc74cb7-xtw7l"] Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.487032 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-9dfc74cb7-xtw7l"] Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.514845 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.526469 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-66cbb7db47-sh8m4"] Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.543254 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-66cbb7db47-sh8m4"] Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.543477 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.543521 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-694b5876b5-phgqj" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.669402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljqw6\" (UniqueName: \"kubernetes.io/projected/dfb40a95-9140-4421-8f68-ecc6870d903a-kube-api-access-ljqw6\") pod \"dfb40a95-9140-4421-8f68-ecc6870d903a\" (UID: \"dfb40a95-9140-4421-8f68-ecc6870d903a\") " Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.669495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfb40a95-9140-4421-8f68-ecc6870d903a-operator-scripts\") pod \"dfb40a95-9140-4421-8f68-ecc6870d903a\" (UID: \"dfb40a95-9140-4421-8f68-ecc6870d903a\") " Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.674488 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfb40a95-9140-4421-8f68-ecc6870d903a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfb40a95-9140-4421-8f68-ecc6870d903a" (UID: "dfb40a95-9140-4421-8f68-ecc6870d903a"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.744617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb40a95-9140-4421-8f68-ecc6870d903a-kube-api-access-ljqw6" (OuterVolumeSpecName: "kube-api-access-ljqw6") pod "dfb40a95-9140-4421-8f68-ecc6870d903a" (UID: "dfb40a95-9140-4421-8f68-ecc6870d903a"). InnerVolumeSpecName "kube-api-access-ljqw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.773049 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljqw6\" (UniqueName: \"kubernetes.io/projected/dfb40a95-9140-4421-8f68-ecc6870d903a-kube-api-access-ljqw6\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.773083 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfb40a95-9140-4421-8f68-ecc6870d903a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.796121 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.877532 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffa815c9-d525-45d3-a179-dce588ffd65d-operator-scripts\") pod \"ffa815c9-d525-45d3-a179-dce588ffd65d\" (UID: \"ffa815c9-d525-45d3-a179-dce588ffd65d\") " Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.877630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/ffa815c9-d525-45d3-a179-dce588ffd65d-kube-api-access-522kq\") pod \"ffa815c9-d525-45d3-a179-dce588ffd65d\" (UID: \"ffa815c9-d525-45d3-a179-dce588ffd65d\") " Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.882237 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa815c9-d525-45d3-a179-dce588ffd65d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ffa815c9-d525-45d3-a179-dce588ffd65d" (UID: "ffa815c9-d525-45d3-a179-dce588ffd65d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.897786 4749 generic.go:334] "Generic (PLEG): container finished" podID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerID="e0252959e510eb68239b1471cedb9131e129c7309d159863a0c309705ba775ac" exitCode=0 Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.897873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rdwp" event={"ID":"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1","Type":"ContainerDied","Data":"e0252959e510eb68239b1471cedb9131e129c7309d159863a0c309705ba775ac"} Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.901056 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa815c9-d525-45d3-a179-dce588ffd65d-kube-api-access-522kq" (OuterVolumeSpecName: "kube-api-access-522kq") pod "ffa815c9-d525-45d3-a179-dce588ffd65d" (UID: "ffa815c9-d525-45d3-a179-dce588ffd65d"). InnerVolumeSpecName "kube-api-access-522kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.918743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a39b5ea6-cb49-4c58-85c4-f9b274ec979b","Type":"ContainerStarted","Data":"f11cde68104a2a4b73d3e8afe779bfd8bb44e69915362836e6992dd232f1f94c"} Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.966165 4749 generic.go:334] "Generic (PLEG): container finished" podID="f53d974f-353d-49e8-9347-658cf61ed52b" containerID="d8413800bacc7001e486a90301643948645e44db627651c5e1d9f1588ec6c17c" exitCode=0 Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.966252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4k5x8" event={"ID":"f53d974f-353d-49e8-9347-658cf61ed52b","Type":"ContainerDied","Data":"d8413800bacc7001e486a90301643948645e44db627651c5e1d9f1588ec6c17c"} Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.980442 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffa815c9-d525-45d3-a179-dce588ffd65d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.980465 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/ffa815c9-d525-45d3-a179-dce588ffd65d-kube-api-access-522kq\") on node \"crc\" DevicePath \"\"" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.985607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" event={"ID":"dfb40a95-9140-4421-8f68-ecc6870d903a","Type":"ContainerDied","Data":"f63ac6f32ed42f2923c7f72e9b8445760a17f0b6e087a091cb85526bbf1b5ac5"} Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.985641 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f63ac6f32ed42f2923c7f72e9b8445760a17f0b6e087a091cb85526bbf1b5ac5" Jan 28 18:59:59 crc kubenswrapper[4749]: I0128 18:59:59.985713 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-68d1-account-create-update-6vdlq" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.002704 4749 generic.go:334] "Generic (PLEG): container finished" podID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerID="917d85169a5b197c14265a97b8d1a2cbf02b19faa3f23321a8339221122881d0" exitCode=0 Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.002759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c94bd564-mjrws" event={"ID":"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc","Type":"ContainerDied","Data":"917d85169a5b197c14265a97b8d1a2cbf02b19faa3f23321a8339221122881d0"} Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.006271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwgd9" event={"ID":"5b3182d1-16fe-490b-80d1-b0d3445cf1c8","Type":"ContainerStarted","Data":"aaef5ee0a92c08baea278b3664f232d458db46fe8ef4ec77605633d19da1d631"} Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.015031 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-54df-account-create-update-mnv5k" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.016169 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-54df-account-create-update-mnv5k" event={"ID":"ffa815c9-d525-45d3-a179-dce588ffd65d","Type":"ContainerDied","Data":"55b721ce53d5eedea3b64feba41626c2549d530b285364e4183e2a444b50800c"} Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.016205 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55b721ce53d5eedea3b64feba41626c2549d530b285364e4183e2a444b50800c" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.104260 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-wwgd9" podStartSLOduration=9.104229729 podStartE2EDuration="9.104229729s" podCreationTimestamp="2026-01-28 18:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:00.049495506 +0000 UTC m=+1468.061022281" watchObservedRunningTime="2026-01-28 19:00:00.104229729 +0000 UTC m=+1468.115756504" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.195396 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.216952 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8"] Jan 28 19:00:00 crc kubenswrapper[4749]: E0128 19:00:00.224208 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb40a95-9140-4421-8f68-ecc6870d903a" containerName="mariadb-account-create-update" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.224256 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb40a95-9140-4421-8f68-ecc6870d903a" containerName="mariadb-account-create-update" Jan 28 19:00:00 crc kubenswrapper[4749]: E0128 19:00:00.224288 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa815c9-d525-45d3-a179-dce588ffd65d" containerName="mariadb-account-create-update" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.224296 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa815c9-d525-45d3-a179-dce588ffd65d" containerName="mariadb-account-create-update" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.224618 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa815c9-d525-45d3-a179-dce588ffd65d" containerName="mariadb-account-create-update" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.224648 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb40a95-9140-4421-8f68-ecc6870d903a" containerName="mariadb-account-create-update" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.225788 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.229077 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.229228 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.231193 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8"] Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.317600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j94z\" (UniqueName: \"kubernetes.io/projected/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-kube-api-access-8j94z\") pod \"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.317692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-config-volume\") pod \"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.318050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-secret-volume\") pod \"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.422127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-secret-volume\") pod \"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.422459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j94z\" (UniqueName: \"kubernetes.io/projected/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-kube-api-access-8j94z\") pod \"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.422513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-config-volume\") pod \"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.423502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-config-volume\") pod 
\"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.428565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-secret-volume\") pod \"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.444254 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j94z\" (UniqueName: \"kubernetes.io/projected/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-kube-api-access-8j94z\") pod \"collect-profiles-29493780-277f8\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.552717 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.950156 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff11c89-e09d-40a3-ac11-4eece14288c9" path="/var/lib/kubelet/pods/3ff11c89-e09d-40a3-ac11-4eece14288c9/volumes" Jan 28 19:00:00 crc kubenswrapper[4749]: I0128 19:00:00.952044 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be080cb1-98cd-4790-8132-52818502d1a0" path="/var/lib/kubelet/pods/be080cb1-98cd-4790-8132-52818502d1a0/volumes" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.042000 4749 generic.go:334] "Generic (PLEG): container finished" podID="5b3182d1-16fe-490b-80d1-b0d3445cf1c8" containerID="aaef5ee0a92c08baea278b3664f232d458db46fe8ef4ec77605633d19da1d631" exitCode=0 Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.042066 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwgd9" event={"ID":"5b3182d1-16fe-490b-80d1-b0d3445cf1c8","Type":"ContainerDied","Data":"aaef5ee0a92c08baea278b3664f232d458db46fe8ef4ec77605633d19da1d631"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.053884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5667478cf5-vcwtt" event={"ID":"b7c300e5-3d03-481e-bd44-85d8f7d44e72","Type":"ContainerStarted","Data":"ecabafc106a4f5fb1245795f06acd72b5c22562b61b97b1e0e841bd557c9c18d"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.054033 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.083043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a39b5ea6-cb49-4c58-85c4-f9b274ec979b","Type":"ContainerStarted","Data":"e0598f56285e00b1ba56e7534354a15a6379b2f2c9e1b13fa97582dd33d75b71"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.104165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f31cc43a-6cff-4aec-8306-e6d0eb59f973","Type":"ContainerStarted","Data":"6ac21f16b2413529f7f1461b42249562c713b194f86f20edd896c96c00229c32"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.107963 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-engine-5667478cf5-vcwtt" podStartSLOduration=19.107941424 podStartE2EDuration="19.107941424s" podCreationTimestamp="2026-01-28 18:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:01.093682319 +0000 UTC m=+1469.105209094" watchObservedRunningTime="2026-01-28 19:00:01.107941424 +0000 UTC m=+1469.119468199" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.116296 4749 generic.go:334] "Generic (PLEG): container finished" podID="7fdd109b-56e0-4cf0-a763-8573984ee415" containerID="cf429ea6258525fe1bedb4b48ad0d9a5bbaa3c21ed32002c680f52f446407cf5" exitCode=1 Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.116442 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-778d479c94-bcw7x" event={"ID":"7fdd109b-56e0-4cf0-a763-8573984ee415","Type":"ContainerDied","Data":"cf429ea6258525fe1bedb4b48ad0d9a5bbaa3c21ed32002c680f52f446407cf5"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.117161 4749 scope.go:117] "RemoveContainer" containerID="cf429ea6258525fe1bedb4b48ad0d9a5bbaa3c21ed32002c680f52f446407cf5" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.147835 4749 generic.go:334] "Generic (PLEG): container finished" podID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" containerID="f69697977a4b5805c6f143377b828d3af9f5fb67bada8f02d2bf03ee8efb3a4a" exitCode=1 Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.149202 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" event={"ID":"b6dba594-df49-48ca-9cdf-5ccb4a76d74c","Type":"ContainerDied","Data":"f69697977a4b5805c6f143377b828d3af9f5fb67bada8f02d2bf03ee8efb3a4a"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.149824 4749 scope.go:117] "RemoveContainer" containerID="f69697977a4b5805c6f143377b828d3af9f5fb67bada8f02d2bf03ee8efb3a4a" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.156021 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" event={"ID":"df71ee0d-ef89-4e80-807c-2751810bca99","Type":"ContainerStarted","Data":"fa630c201f5d9e6b20d5ab301438a6d24afb2347d56cd6fedff144a8fc1d2083"} Jan 28 19:00:01 crc kubenswrapper[4749]: E0128 19:00:01.156226 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-74bcff4f86-hqml8_openstack(b6dba594-df49-48ca-9cdf-5ccb4a76d74c)\"" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.156578 4749 scope.go:117] "RemoveContainer" containerID="b77987f8cf360347e2724320def6443d616088d3e1fc0398dc8804537dfd5912" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.156998 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.176868 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0f8e837-ecac-47c7-9a8e-204d9bbbc42b" containerID="5228904e8bfc7e1c49573772addcd5454552ea07d2fbe373ba8c93d6be1c6dee" exitCode=0 Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.176991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6czqc" 
event={"ID":"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b","Type":"ContainerDied","Data":"5228904e8bfc7e1c49573772addcd5454552ea07d2fbe373ba8c93d6be1c6dee"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.195732 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"35c07eea-b699-4b39-b6db-ad0a9536ebe4","Type":"ContainerStarted","Data":"ecb8faa10dd9c4ce71d472909a6262cccf84ec73d8b46d7b153f0828879fb548"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.202850 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-758976cb66-pthtk" event={"ID":"8a13bcdb-3c7e-487b-90a5-5f794941eb5d","Type":"ContainerStarted","Data":"12e26de7f7737c44f1f86fddcc0555ce6ffab0f6f8b2440e21420f481df1288c"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.203198 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-758976cb66-pthtk" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.216830 4749 generic.go:334] "Generic (PLEG): container finished" podID="54b5eda2-7879-473b-b45d-e5ef3d128fa4" containerID="e65783790d4933c836c67b35bd2edce8ffbf949b50c09754fc13674b22b8253e" exitCode=0 Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.217146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-55c2-account-create-update-478qb" event={"ID":"54b5eda2-7879-473b-b45d-e5ef3d128fa4","Type":"ContainerDied","Data":"e65783790d4933c836c67b35bd2edce8ffbf949b50c09754fc13674b22b8253e"} Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.285600 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" podStartSLOduration=14.285581155 podStartE2EDuration="14.285581155s" podCreationTimestamp="2026-01-28 18:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:01.198773444 +0000 UTC m=+1469.210300239" watchObservedRunningTime="2026-01-28 19:00:01.285581155 +0000 UTC m=+1469.297107930" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.313777 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-758976cb66-pthtk" podStartSLOduration=14.313753465 podStartE2EDuration="14.313753465s" podCreationTimestamp="2026-01-28 18:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:01.225984902 +0000 UTC m=+1469.237511687" watchObservedRunningTime="2026-01-28 19:00:01.313753465 +0000 UTC m=+1469.325280240" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.455873 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67c94bd564-mjrws" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.515852 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-combined-ca-bundle\") pod \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.515924 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz6jk\" (UniqueName: \"kubernetes.io/projected/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-kube-api-access-qz6jk\") pod \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.516096 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-httpd-config\") pod \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.516131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-config\") pod \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.516254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-ovndb-tls-certs\") pod \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\" (UID: \"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc\") " Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.532947 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" (UID: "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.534099 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-kube-api-access-qz6jk" (OuterVolumeSpecName: "kube-api-access-qz6jk") pod "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" (UID: "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc"). InnerVolumeSpecName "kube-api-access-qz6jk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.567867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8"] Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.621254 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.621297 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz6jk\" (UniqueName: \"kubernetes.io/projected/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-kube-api-access-qz6jk\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.647194 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-config" (OuterVolumeSpecName: "config") pod "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" (UID: "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.723562 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-config\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.778530 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" (UID: "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.813974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" (UID: "47edfaf7-6dbb-4e01-9580-9a329c9b8bbc"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.826178 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:01 crc kubenswrapper[4749]: I0128 19:00:01.826209 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.244622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" event={"ID":"6666ce12-edde-4ccc-ad28-deb0f7c7ae25","Type":"ContainerStarted","Data":"116e69a4f5c782e407db5f3cc0475a31a9f3b75ecf60f2d92006af0ad118ebb2"} Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.339584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rdwp" event={"ID":"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1","Type":"ContainerStarted","Data":"9c55ae714bddbfc9a23a95dfd2b1d4fcf2eadf9d0b585b1fb4241ff1dbea1850"} Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.341112 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.362193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"35c07eea-b699-4b39-b6db-ad0a9536ebe4","Type":"ContainerStarted","Data":"a689e3a3cd94126164eaa0d1f80e75c309b024a6099643a83d32d5a84302a135"} Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.367165 4749 generic.go:334] "Generic (PLEG): container finished" podID="7fdd109b-56e0-4cf0-a763-8573984ee415" containerID="a547012dc06af190d0c75f964a7bbf2bf2404fd99297a7b3df8c715c87d4861f" exitCode=1 Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.367251 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-778d479c94-bcw7x" event={"ID":"7fdd109b-56e0-4cf0-a763-8573984ee415","Type":"ContainerDied","Data":"a547012dc06af190d0c75f964a7bbf2bf2404fd99297a7b3df8c715c87d4861f"} Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.367438 4749 scope.go:117] "RemoveContainer" containerID="cf429ea6258525fe1bedb4b48ad0d9a5bbaa3c21ed32002c680f52f446407cf5" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.368728 4749 scope.go:117] "RemoveContainer" containerID="a547012dc06af190d0c75f964a7bbf2bf2404fd99297a7b3df8c715c87d4861f" Jan 28 19:00:02 crc kubenswrapper[4749]: E0128 19:00:02.369182 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-778d479c94-bcw7x_openstack(7fdd109b-56e0-4cf0-a763-8573984ee415)\"" pod="openstack/heat-api-778d479c94-bcw7x" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.389469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4k5x8" event={"ID":"f53d974f-353d-49e8-9347-658cf61ed52b","Type":"ContainerDied","Data":"3d9e5be952ff8d7415bd3ca75394b4de51112f7bdf43b7f266cd7fb85a34af52"} Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.389529 4749 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3d9e5be952ff8d7415bd3ca75394b4de51112f7bdf43b7f266cd7fb85a34af52" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.389594 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4k5x8" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.413926 4749 scope.go:117] "RemoveContainer" containerID="f69697977a4b5805c6f143377b828d3af9f5fb67bada8f02d2bf03ee8efb3a4a" Jan 28 19:00:02 crc kubenswrapper[4749]: E0128 19:00:02.414209 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-74bcff4f86-hqml8_openstack(b6dba594-df49-48ca-9cdf-5ccb4a76d74c)\"" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.441473 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53d974f-353d-49e8-9347-658cf61ed52b-operator-scripts\") pod \"f53d974f-353d-49e8-9347-658cf61ed52b\" (UID: \"f53d974f-353d-49e8-9347-658cf61ed52b\") " Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.441586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75b7h\" (UniqueName: \"kubernetes.io/projected/f53d974f-353d-49e8-9347-658cf61ed52b-kube-api-access-75b7h\") pod \"f53d974f-353d-49e8-9347-658cf61ed52b\" (UID: \"f53d974f-353d-49e8-9347-658cf61ed52b\") " Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.442160 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53d974f-353d-49e8-9347-658cf61ed52b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f53d974f-353d-49e8-9347-658cf61ed52b" (UID: "f53d974f-353d-49e8-9347-658cf61ed52b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.442761 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53d974f-353d-49e8-9347-658cf61ed52b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.461664 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67c94bd564-mjrws" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.462516 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c94bd564-mjrws" event={"ID":"47edfaf7-6dbb-4e01-9580-9a329c9b8bbc","Type":"ContainerDied","Data":"f9aab673c56d11f6f91bb1de34d8eb0651d6371c92972576605608872ef30eb2"} Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.470551 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53d974f-353d-49e8-9347-658cf61ed52b-kube-api-access-75b7h" (OuterVolumeSpecName: "kube-api-access-75b7h") pod "f53d974f-353d-49e8-9347-658cf61ed52b" (UID: "f53d974f-353d-49e8-9347-658cf61ed52b"). InnerVolumeSpecName "kube-api-access-75b7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.545871 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75b7h\" (UniqueName: \"kubernetes.io/projected/f53d974f-353d-49e8-9347-658cf61ed52b-kube-api-access-75b7h\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.691993 4749 scope.go:117] "RemoveContainer" containerID="076ee848881ff4c5f111de5abc89f0713e2834b80064fbd9e09e8fade6b522bc" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.699358 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67c94bd564-mjrws"] Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.717155 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67c94bd564-mjrws"] Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.759791 4749 scope.go:117] "RemoveContainer" containerID="917d85169a5b197c14265a97b8d1a2cbf02b19faa3f23321a8339221122881d0" Jan 28 19:00:02 crc kubenswrapper[4749]: I0128 19:00:02.889576 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" path="/var/lib/kubelet/pods/47edfaf7-6dbb-4e01-9580-9a329c9b8bbc/volumes" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.281696 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.369514 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.369572 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.379222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b5eda2-7879-473b-b45d-e5ef3d128fa4-operator-scripts\") pod \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\" (UID: \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\") " Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.379481 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7rms\" (UniqueName: \"kubernetes.io/projected/54b5eda2-7879-473b-b45d-e5ef3d128fa4-kube-api-access-z7rms\") pod \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\" (UID: \"54b5eda2-7879-473b-b45d-e5ef3d128fa4\") " Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.381040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b5eda2-7879-473b-b45d-e5ef3d128fa4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54b5eda2-7879-473b-b45d-e5ef3d128fa4" (UID: "54b5eda2-7879-473b-b45d-e5ef3d128fa4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.457794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b5eda2-7879-473b-b45d-e5ef3d128fa4-kube-api-access-z7rms" (OuterVolumeSpecName: "kube-api-access-z7rms") pod "54b5eda2-7879-473b-b45d-e5ef3d128fa4" (UID: "54b5eda2-7879-473b-b45d-e5ef3d128fa4"). InnerVolumeSpecName "kube-api-access-z7rms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.480106 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.480161 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.486901 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b5eda2-7879-473b-b45d-e5ef3d128fa4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.486945 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7rms\" (UniqueName: \"kubernetes.io/projected/54b5eda2-7879-473b-b45d-e5ef3d128fa4-kube-api-access-z7rms\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.526839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6czqc" event={"ID":"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b","Type":"ContainerDied","Data":"efa69aeb8a14dbf4f25a01d0ca268744e32cf9b4ab249ec468dd64bcdc52b60f"} Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.526879 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efa69aeb8a14dbf4f25a01d0ca268744e32cf9b4ab249ec468dd64bcdc52b60f" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.576421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f31cc43a-6cff-4aec-8306-e6d0eb59f973","Type":"ContainerStarted","Data":"cb18a50a61abb16993970b6df8f4e219cfca68e61469de0c38913980c1ffd2b8"} Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.595050 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6czqc" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.639534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"35c07eea-b699-4b39-b6db-ad0a9536ebe4","Type":"ContainerStarted","Data":"a46b6dfe6a6860e931614353aaabb260e93a74c9a381c576286aeb41d72cd07e"} Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.709965 4749 scope.go:117] "RemoveContainer" containerID="a547012dc06af190d0c75f964a7bbf2bf2404fd99297a7b3df8c715c87d4861f" Jan 28 19:00:03 crc kubenswrapper[4749]: E0128 19:00:03.710503 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-778d479c94-bcw7x_openstack(7fdd109b-56e0-4cf0-a763-8573984ee415)\"" pod="openstack/heat-api-778d479c94-bcw7x" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.766594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-55c2-account-create-update-478qb" event={"ID":"54b5eda2-7879-473b-b45d-e5ef3d128fa4","Type":"ContainerDied","Data":"2ebbd860d9db2a224975ff5290ff9e8066b8537386b8c9948e7b545c8f6aad71"} Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.766639 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ebbd860d9db2a224975ff5290ff9e8066b8537386b8c9948e7b545c8f6aad71" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.766957 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-55c2-account-create-update-478qb" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.767615 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.777473 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.777441571 podStartE2EDuration="8.777441571s" podCreationTimestamp="2026-01-28 18:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:03.628648279 +0000 UTC m=+1471.640175054" watchObservedRunningTime="2026-01-28 19:00:03.777441571 +0000 UTC m=+1471.788968346" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.786989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" event={"ID":"6666ce12-edde-4ccc-ad28-deb0f7c7ae25","Type":"ContainerStarted","Data":"606c259c4aac7dfaf23dd9be63d3771fa6817f3a4ba28d749d36e7d7fe7465ca"} Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.790113 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.790068796 podStartE2EDuration="6.790068796s" podCreationTimestamp="2026-01-28 18:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:03.719065938 +0000 UTC m=+1471.730592733" watchObservedRunningTime="2026-01-28 19:00:03.790068796 +0000 UTC m=+1471.801595571" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.812221 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-operator-scripts\") pod \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\" (UID: \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\") " Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.813790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0f8e837-ecac-47c7-9a8e-204d9bbbc42b" (UID: "e0f8e837-ecac-47c7-9a8e-204d9bbbc42b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.815049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwgd9" event={"ID":"5b3182d1-16fe-490b-80d1-b0d3445cf1c8","Type":"ContainerDied","Data":"58e50dc72e533f73486339212ac2c8114d9e0f36c8f6014cff97bafafea5768f"} Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.815094 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e50dc72e533f73486339212ac2c8114d9e0f36c8f6014cff97bafafea5768f" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.815317 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wwgd9" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.817848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22s59\" (UniqueName: \"kubernetes.io/projected/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-kube-api-access-22s59\") pod \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\" (UID: \"e0f8e837-ecac-47c7-9a8e-204d9bbbc42b\") " Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.819804 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.832228 4749 generic.go:334] "Generic (PLEG): container finished" podID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerID="9c55ae714bddbfc9a23a95dfd2b1d4fcf2eadf9d0b585b1fb4241ff1dbea1850" exitCode=0 Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.832562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rdwp" event={"ID":"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1","Type":"ContainerDied","Data":"9c55ae714bddbfc9a23a95dfd2b1d4fcf2eadf9d0b585b1fb4241ff1dbea1850"} Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.835249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-kube-api-access-22s59" (OuterVolumeSpecName: "kube-api-access-22s59") pod "e0f8e837-ecac-47c7-9a8e-204d9bbbc42b" (UID: "e0f8e837-ecac-47c7-9a8e-204d9bbbc42b"). InnerVolumeSpecName "kube-api-access-22s59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.865211 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a39b5ea6-cb49-4c58-85c4-f9b274ec979b","Type":"ContainerStarted","Data":"ea46318c5aec9123e11fd015f1289611fe6dea2bca912c4b82d62ef006492f2e"} Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.866815 4749 scope.go:117] "RemoveContainer" containerID="f69697977a4b5805c6f143377b828d3af9f5fb67bada8f02d2bf03ee8efb3a4a" Jan 28 19:00:03 crc kubenswrapper[4749]: E0128 19:00:03.867428 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-74bcff4f86-hqml8_openstack(b6dba594-df49-48ca-9cdf-5ccb4a76d74c)\"" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.936001 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnqr7\" (UniqueName: \"kubernetes.io/projected/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-kube-api-access-dnqr7\") pod \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\" (UID: \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\") " Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.936049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-operator-scripts\") pod \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\" (UID: \"5b3182d1-16fe-490b-80d1-b0d3445cf1c8\") " Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.936871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b3182d1-16fe-490b-80d1-b0d3445cf1c8" (UID: "5b3182d1-16fe-490b-80d1-b0d3445cf1c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.950917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-kube-api-access-dnqr7" (OuterVolumeSpecName: "kube-api-access-dnqr7") pod "5b3182d1-16fe-490b-80d1-b0d3445cf1c8" (UID: "5b3182d1-16fe-490b-80d1-b0d3445cf1c8"). InnerVolumeSpecName "kube-api-access-dnqr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.973493 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnqr7\" (UniqueName: \"kubernetes.io/projected/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-kube-api-access-dnqr7\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.973522 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3182d1-16fe-490b-80d1-b0d3445cf1c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.973532 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22s59\" (UniqueName: \"kubernetes.io/projected/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b-kube-api-access-22s59\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:03 crc kubenswrapper[4749]: I0128 19:00:03.974244 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.974225329 podStartE2EDuration="7.974225329s" podCreationTimestamp="2026-01-28 18:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:03.894602107 +0000 UTC m=+1471.906128892" watchObservedRunningTime="2026-01-28 19:00:03.974225329 +0000 UTC m=+1471.985752104" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.890441 4749 generic.go:334] "Generic (PLEG): container finished" podID="6666ce12-edde-4ccc-ad28-deb0f7c7ae25" containerID="606c259c4aac7dfaf23dd9be63d3771fa6817f3a4ba28d749d36e7d7fe7465ca" exitCode=0 Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.890846 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6czqc" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.894192 4749 scope.go:117] "RemoveContainer" containerID="a547012dc06af190d0c75f964a7bbf2bf2404fd99297a7b3df8c715c87d4861f" Jan 28 19:00:04 crc kubenswrapper[4749]: E0128 19:00:04.894564 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-778d479c94-bcw7x_openstack(7fdd109b-56e0-4cf0-a763-8573984ee415)\"" pod="openstack/heat-api-778d479c94-bcw7x" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.894945 4749 scope.go:117] "RemoveContainer" containerID="f69697977a4b5805c6f143377b828d3af9f5fb67bada8f02d2bf03ee8efb3a4a" Jan 28 19:00:04 crc kubenswrapper[4749]: E0128 19:00:04.895275 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-74bcff4f86-hqml8_openstack(b6dba594-df49-48ca-9cdf-5ccb4a76d74c)\"" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.910470 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b5nqn"] Jan 28 19:00:04 crc kubenswrapper[4749]: E0128 19:00:04.910853 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerName="neutron-api" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.910869 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerName="neutron-api" Jan 28 19:00:04 crc kubenswrapper[4749]: E0128 19:00:04.910888 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b5eda2-7879-473b-b45d-e5ef3d128fa4" containerName="mariadb-account-create-update" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.910895 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b5eda2-7879-473b-b45d-e5ef3d128fa4" containerName="mariadb-account-create-update" Jan 28 19:00:04 crc kubenswrapper[4749]: E0128 19:00:04.910907 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f8e837-ecac-47c7-9a8e-204d9bbbc42b" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.910912 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f8e837-ecac-47c7-9a8e-204d9bbbc42b" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: E0128 19:00:04.910944 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerName="neutron-httpd" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.910949 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerName="neutron-httpd" Jan 28 19:00:04 crc kubenswrapper[4749]: E0128 19:00:04.910962 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3182d1-16fe-490b-80d1-b0d3445cf1c8" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.910968 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3182d1-16fe-490b-80d1-b0d3445cf1c8" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: E0128 19:00:04.910981 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f53d974f-353d-49e8-9347-658cf61ed52b" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.910987 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53d974f-353d-49e8-9347-658cf61ed52b" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.911199 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f8e837-ecac-47c7-9a8e-204d9bbbc42b" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.911220 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerName="neutron-httpd" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.911232 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="47edfaf7-6dbb-4e01-9580-9a329c9b8bbc" containerName="neutron-api" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.911240 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53d974f-353d-49e8-9347-658cf61ed52b" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.911256 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3182d1-16fe-490b-80d1-b0d3445cf1c8" containerName="mariadb-database-create" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.911263 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b5eda2-7879-473b-b45d-e5ef3d128fa4" containerName="mariadb-account-create-update" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.913455 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5nqn"] Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.914053 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rdwp" event={"ID":"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1","Type":"ContainerStarted","Data":"6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537"} Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.914087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" event={"ID":"6666ce12-edde-4ccc-ad28-deb0f7c7ae25","Type":"ContainerDied","Data":"606c259c4aac7dfaf23dd9be63d3771fa6817f3a4ba28d749d36e7d7fe7465ca"} Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.913787 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:04 crc kubenswrapper[4749]: I0128 19:00:04.941948 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4rdwp" podStartSLOduration=20.173804593 podStartE2EDuration="23.941925268s" podCreationTimestamp="2026-01-28 18:59:41 +0000 UTC" firstStartedPulling="2026-01-28 18:59:59.905106763 +0000 UTC m=+1467.916633538" lastFinishedPulling="2026-01-28 19:00:03.673227438 +0000 UTC m=+1471.684754213" observedRunningTime="2026-01-28 19:00:04.928119315 +0000 UTC m=+1472.939646110" watchObservedRunningTime="2026-01-28 19:00:04.941925268 +0000 UTC m=+1472.953452043" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.003841 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4r4v\" (UniqueName: \"kubernetes.io/projected/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-kube-api-access-p4r4v\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.004310 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-utilities\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.005577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-catalog-content\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.107888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-catalog-content\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.108040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4r4v\" (UniqueName: \"kubernetes.io/projected/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-kube-api-access-p4r4v\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.108112 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-utilities\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.108623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-catalog-content\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 
19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.108749 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-utilities\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.153599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4r4v\" (UniqueName: \"kubernetes.io/projected/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-kube-api-access-p4r4v\") pod \"community-operators-b5nqn\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.266979 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.425462 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.540734 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j94z\" (UniqueName: \"kubernetes.io/projected/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-kube-api-access-8j94z\") pod \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.540852 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-secret-volume\") pod \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.540899 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-config-volume\") pod \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\" (UID: \"6666ce12-edde-4ccc-ad28-deb0f7c7ae25\") " Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.542039 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-config-volume" (OuterVolumeSpecName: "config-volume") pod "6666ce12-edde-4ccc-ad28-deb0f7c7ae25" (UID: "6666ce12-edde-4ccc-ad28-deb0f7c7ae25"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.542686 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.555614 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-kube-api-access-8j94z" (OuterVolumeSpecName: "kube-api-access-8j94z") pod "6666ce12-edde-4ccc-ad28-deb0f7c7ae25" (UID: "6666ce12-edde-4ccc-ad28-deb0f7c7ae25"). InnerVolumeSpecName "kube-api-access-8j94z". 
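The "Observed pod startup duration" entry for redhat-marketplace-4rdwp above reports both podStartE2EDuration and podStartSLOduration. The numbers are consistent with the E2E value being watchObservedRunningTime minus podCreationTimestamp, and the SLO value being the E2E value minus the image-pull window (lastFinishedPulling minus firstStartedPulling), matching how the Kubernetes pod-startup SLO is usually described (image pull time excluded). A small check of that arithmetic with the values from the entry, assuming that interpretation:

    from datetime import datetime

    def ts(s):
        # keep only microsecond precision so strptime's %f accepts the value
        return datetime.strptime(s[:26], "%Y-%m-%d %H:%M:%S.%f")

    created    = datetime(2026, 1, 28, 18, 59, 41)
    first_pull = ts("2026-01-28 18:59:59.905106763")
    last_pull  = ts("2026-01-28 19:00:03.673227438")
    observed   = ts("2026-01-28 19:00:04.941925268")

    e2e = (observed - created).total_seconds()            # ~23.942  (log: 23.941925268s)
    slo = e2e - (last_pull - first_pull).total_seconds()  # ~20.174  (log: 20.173804593)
    print(round(e2e, 3), round(slo, 3))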
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.557431 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6666ce12-edde-4ccc-ad28-deb0f7c7ae25" (UID: "6666ce12-edde-4ccc-ad28-deb0f7c7ae25"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.645568 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j94z\" (UniqueName: \"kubernetes.io/projected/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-kube-api-access-8j94z\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.645609 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6666ce12-edde-4ccc-ad28-deb0f7c7ae25-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.839869 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5nqn"] Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.909083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" event={"ID":"6666ce12-edde-4ccc-ad28-deb0f7c7ae25","Type":"ContainerDied","Data":"116e69a4f5c782e407db5f3cc0475a31a9f3b75ecf60f2d92006af0ad118ebb2"} Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.909383 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="116e69a4f5c782e407db5f3cc0475a31a9f3b75ecf60f2d92006af0ad118ebb2" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.909113 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8" Jan 28 19:00:05 crc kubenswrapper[4749]: I0128 19:00:05.925220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5nqn" event={"ID":"7aa7fd2d-ff29-4928-9e51-54ce06ec0028","Type":"ContainerStarted","Data":"1752c6305e2258b5d2282a29b405dea823a8c8600235132dc8af301d4a4face2"} Jan 28 19:00:06 crc kubenswrapper[4749]: I0128 19:00:06.476651 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 28 19:00:06 crc kubenswrapper[4749]: I0128 19:00:06.700008 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 28 19:00:06 crc kubenswrapper[4749]: I0128 19:00:06.916072 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:06 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:06 crc kubenswrapper[4749]: > Jan 28 19:00:06 crc kubenswrapper[4749]: I0128 19:00:06.937917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5nqn" event={"ID":"7aa7fd2d-ff29-4928-9e51-54ce06ec0028","Type":"ContainerStarted","Data":"3b703fc79c1ddb2b535b71176efde7fc0fc40d0f4e617872bc1dbb8fc50d7c4c"} Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.188115 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pnwds"] Jan 28 19:00:07 crc kubenswrapper[4749]: E0128 19:00:07.188678 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6666ce12-edde-4ccc-ad28-deb0f7c7ae25" containerName="collect-profiles" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.188694 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6666ce12-edde-4ccc-ad28-deb0f7c7ae25" containerName="collect-profiles" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.188954 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6666ce12-edde-4ccc-ad28-deb0f7c7ae25" containerName="collect-profiles" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.189749 4749 util.go:30] "No sandbox for pod can be found. 
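The "Probe failed ... timeout: failed to connect service ':50051' within 1s" output above is the registry-server startup probe for redhat-operators-n5mr8; later entries in this log (the 19:00:21 ExecSync errors) show that these probes exec grpc_health_probe -addr=:50051 inside the container. A minimal sketch of running the same check by hand, assuming grpc_health_probe is available on PATH in the catalog image (illustrative only, not the kubelet's code path):

    import subprocess

    def registry_server_healthy(addr=":50051", timeout_s=2):
        """Run the same gRPC health check the exec probe uses; True iff SERVING."""
        try:
            proc = subprocess.run(
                ["grpc_health_probe", f"-addr={addr}"],
                capture_output=True, text=True, timeout=timeout_s,
            )
        except subprocess.TimeoutExpired:
            return False
        return proc.returncode == 0

A non-zero exit (or a timeout, as in the entry above) is what the kubelet reports as a failed Startup or Readiness probe.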
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.192667 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.193355 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.193802 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-stslj" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.210571 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pnwds"] Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.282140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-config-data\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.282625 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5srf\" (UniqueName: \"kubernetes.io/projected/d68f84a9-069c-4b25-a939-cd98ba9ab12b-kube-api-access-m5srf\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.282765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.283038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-scripts\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.386938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5srf\" (UniqueName: \"kubernetes.io/projected/d68f84a9-069c-4b25-a939-cd98ba9ab12b-kube-api-access-m5srf\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.387281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.387392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-scripts\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: 
\"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.387461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-config-data\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.394509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-config-data\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.395103 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-scripts\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.395604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.404992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5srf\" (UniqueName: \"kubernetes.io/projected/d68f84a9-069c-4b25-a939-cd98ba9ab12b-kube-api-access-m5srf\") pod \"nova-cell0-conductor-db-sync-pnwds\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.555771 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.637034 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.637090 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.730922 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.771783 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.951058 4749 generic.go:334] "Generic (PLEG): container finished" podID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerID="3b703fc79c1ddb2b535b71176efde7fc0fc40d0f4e617872bc1dbb8fc50d7c4c" exitCode=0 Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.951208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5nqn" event={"ID":"7aa7fd2d-ff29-4928-9e51-54ce06ec0028","Type":"ContainerDied","Data":"3b703fc79c1ddb2b535b71176efde7fc0fc40d0f4e617872bc1dbb8fc50d7c4c"} Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.953526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1b70372a-0994-4bb9-8369-7b00699ee7c0","Type":"ContainerStarted","Data":"190305bfaafa5e4994374da7893ebbf461042f38671abb0237aea195429a2cfb"} Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.954312 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.954533 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 19:00:07 crc kubenswrapper[4749]: I0128 19:00:07.997651 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.45938345 podStartE2EDuration="44.997632545s" podCreationTimestamp="2026-01-28 18:59:23 +0000 UTC" firstStartedPulling="2026-01-28 18:59:24.830303416 +0000 UTC m=+1432.841830191" lastFinishedPulling="2026-01-28 19:00:07.368552491 +0000 UTC m=+1475.380079286" observedRunningTime="2026-01-28 19:00:07.997545353 +0000 UTC m=+1476.009072128" watchObservedRunningTime="2026-01-28 19:00:07.997632545 +0000 UTC m=+1476.009159320" Jan 28 19:00:08 crc kubenswrapper[4749]: I0128 19:00:08.192832 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pnwds"] Jan 28 19:00:08 crc kubenswrapper[4749]: W0128 19:00:08.198370 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd68f84a9_069c_4b25_a939_cd98ba9ab12b.slice/crio-e6cc7a99716f35cb21bc930ffba1a728e078a38e1db66a0b4a645b7941354933 WatchSource:0}: Error finding container e6cc7a99716f35cb21bc930ffba1a728e078a38e1db66a0b4a645b7941354933: Status 404 returned error can't find the container with id e6cc7a99716f35cb21bc930ffba1a728e078a38e1db66a0b4a645b7941354933 Jan 28 19:00:08 crc kubenswrapper[4749]: I0128 19:00:08.973540 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Jan 28 19:00:08 crc kubenswrapper[4749]: I0128 19:00:08.973869 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 19:00:08 crc kubenswrapper[4749]: I0128 19:00:08.982925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pnwds" event={"ID":"d68f84a9-069c-4b25-a939-cd98ba9ab12b","Type":"ContainerStarted","Data":"e6cc7a99716f35cb21bc930ffba1a728e078a38e1db66a0b4a645b7941354933"} Jan 28 19:00:09 crc kubenswrapper[4749]: I0128 19:00:09.035654 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 19:00:09 crc kubenswrapper[4749]: I0128 19:00:09.039738 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.002648 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.002919 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.002636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5nqn" event={"ID":"7aa7fd2d-ff29-4928-9e51-54ce06ec0028","Type":"ContainerStarted","Data":"eac3f4818b2555bbe2fb0ee7192790f4d7c1d60f9371f257a70952a42840ce08"} Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.003020 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.003061 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.524809 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7b886dbf44-fw2gl" Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.608542 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-74bcff4f86-hqml8"] Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.711634 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-758976cb66-pthtk" Jan 28 19:00:10 crc kubenswrapper[4749]: I0128 19:00:10.780988 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-778d479c94-bcw7x"] Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.638940 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.650687 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.737632 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.737682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.738049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbqn6\" (UniqueName: \"kubernetes.io/projected/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-kube-api-access-tbqn6\") pod \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.738253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-combined-ca-bundle\") pod \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.738416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data\") pod \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.738519 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data-custom\") pod \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\" (UID: \"b6dba594-df49-48ca-9cdf-5ccb4a76d74c\") " Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.744227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-kube-api-access-tbqn6" (OuterVolumeSpecName: "kube-api-access-tbqn6") pod "b6dba594-df49-48ca-9cdf-5ccb4a76d74c" (UID: "b6dba594-df49-48ca-9cdf-5ccb4a76d74c"). InnerVolumeSpecName "kube-api-access-tbqn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.769461 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6dba594-df49-48ca-9cdf-5ccb4a76d74c" (UID: "b6dba594-df49-48ca-9cdf-5ccb4a76d74c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.776449 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6dba594-df49-48ca-9cdf-5ccb4a76d74c" (UID: "b6dba594-df49-48ca-9cdf-5ccb4a76d74c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.836425 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data" (OuterVolumeSpecName: "config-data") pod "b6dba594-df49-48ca-9cdf-5ccb4a76d74c" (UID: "b6dba594-df49-48ca-9cdf-5ccb4a76d74c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.841454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data-custom\") pod \"7fdd109b-56e0-4cf0-a763-8573984ee415\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.841518 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-combined-ca-bundle\") pod \"7fdd109b-56e0-4cf0-a763-8573984ee415\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.841615 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmn5p\" (UniqueName: \"kubernetes.io/projected/7fdd109b-56e0-4cf0-a763-8573984ee415-kube-api-access-dmn5p\") pod \"7fdd109b-56e0-4cf0-a763-8573984ee415\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.842204 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data\") pod \"7fdd109b-56e0-4cf0-a763-8573984ee415\" (UID: \"7fdd109b-56e0-4cf0-a763-8573984ee415\") " Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.843362 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.843382 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbqn6\" (UniqueName: \"kubernetes.io/projected/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-kube-api-access-tbqn6\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.843393 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.843402 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dba594-df49-48ca-9cdf-5ccb4a76d74c-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.845801 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fdd109b-56e0-4cf0-a763-8573984ee415-kube-api-access-dmn5p" (OuterVolumeSpecName: "kube-api-access-dmn5p") pod "7fdd109b-56e0-4cf0-a763-8573984ee415" (UID: "7fdd109b-56e0-4cf0-a763-8573984ee415"). InnerVolumeSpecName "kube-api-access-dmn5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.846540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7fdd109b-56e0-4cf0-a763-8573984ee415" (UID: "7fdd109b-56e0-4cf0-a763-8573984ee415"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.855639 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.936411 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fdd109b-56e0-4cf0-a763-8573984ee415" (UID: "7fdd109b-56e0-4cf0-a763-8573984ee415"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.945238 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.945291 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.945301 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmn5p\" (UniqueName: \"kubernetes.io/projected/7fdd109b-56e0-4cf0-a763-8573984ee415-kube-api-access-dmn5p\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:11 crc kubenswrapper[4749]: I0128 19:00:11.972848 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data" (OuterVolumeSpecName: "config-data") pod "7fdd109b-56e0-4cf0-a763-8573984ee415" (UID: "7fdd109b-56e0-4cf0-a763-8573984ee415"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.047107 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd109b-56e0-4cf0-a763-8573984ee415-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.069616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-778d479c94-bcw7x" event={"ID":"7fdd109b-56e0-4cf0-a763-8573984ee415","Type":"ContainerDied","Data":"9d53c2845385117874295acf2b165e3550e8acb1fbfc54fc24ce15ac18ae3214"} Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.069665 4749 scope.go:117] "RemoveContainer" containerID="a547012dc06af190d0c75f964a7bbf2bf2404fd99297a7b3df8c715c87d4861f" Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.069776 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-778d479c94-bcw7x" Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.096073 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.096410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74bcff4f86-hqml8" event={"ID":"b6dba594-df49-48ca-9cdf-5ccb4a76d74c","Type":"ContainerDied","Data":"de2191ebb312e58cd3c7f1281361ac7fd8ebdd7276481b7dc030961dccf16524"} Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.139863 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-778d479c94-bcw7x"] Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.173401 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-778d479c94-bcw7x"] Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.193423 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.213549 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-74bcff4f86-hqml8"] Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.223606 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-74bcff4f86-hqml8"] Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.230166 4749 scope.go:117] "RemoveContainer" containerID="f69697977a4b5805c6f143377b828d3af9f5fb67bada8f02d2bf03ee8efb3a4a" Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.888652 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" path="/var/lib/kubelet/pods/7fdd109b-56e0-4cf0-a763-8573984ee415/volumes" Jan 28 19:00:12 crc kubenswrapper[4749]: I0128 19:00:12.889810 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" path="/var/lib/kubelet/pods/b6dba594-df49-48ca-9cdf-5ccb4a76d74c/volumes" Jan 28 19:00:13 crc kubenswrapper[4749]: I0128 19:00:13.144135 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5667478cf5-vcwtt" Jan 28 19:00:13 crc kubenswrapper[4749]: I0128 19:00:13.224110 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rdwp"] Jan 28 19:00:13 crc kubenswrapper[4749]: I0128 19:00:13.302123 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-568fc79444-j9z48"] Jan 28 19:00:13 crc kubenswrapper[4749]: I0128 19:00:13.302392 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-568fc79444-j9z48" podUID="47a65467-bac6-4a3c-b634-a3cbe9d282f3" containerName="heat-engine" containerID="cri-o://2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8" gracePeriod=60 Jan 28 19:00:14 crc kubenswrapper[4749]: I0128 19:00:14.135158 4749 generic.go:334] "Generic (PLEG): container finished" podID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerID="eac3f4818b2555bbe2fb0ee7192790f4d7c1d60f9371f257a70952a42840ce08" exitCode=0 Jan 28 19:00:14 crc kubenswrapper[4749]: I0128 19:00:14.135647 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4rdwp" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="registry-server" containerID="cri-o://6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537" gracePeriod=2 Jan 28 19:00:14 crc kubenswrapper[4749]: I0128 19:00:14.135207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-b5nqn" event={"ID":"7aa7fd2d-ff29-4928-9e51-54ce06ec0028","Type":"ContainerDied","Data":"eac3f4818b2555bbe2fb0ee7192790f4d7c1d60f9371f257a70952a42840ce08"} Jan 28 19:00:15 crc kubenswrapper[4749]: I0128 19:00:15.160476 4749 generic.go:334] "Generic (PLEG): container finished" podID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerID="6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537" exitCode=0 Jan 28 19:00:15 crc kubenswrapper[4749]: I0128 19:00:15.160549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rdwp" event={"ID":"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1","Type":"ContainerDied","Data":"6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537"} Jan 28 19:00:15 crc kubenswrapper[4749]: I0128 19:00:15.976636 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 19:00:15 crc kubenswrapper[4749]: I0128 19:00:15.976973 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 19:00:15 crc kubenswrapper[4749]: I0128 19:00:15.981523 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 19:00:15 crc kubenswrapper[4749]: I0128 19:00:15.981653 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 19:00:15 crc kubenswrapper[4749]: I0128 19:00:15.986494 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.636029 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rcvq7"] Jan 28 19:00:16 crc kubenswrapper[4749]: E0128 19:00:16.636616 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" containerName="heat-cfnapi" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.636635 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" containerName="heat-cfnapi" Jan 28 19:00:16 crc kubenswrapper[4749]: E0128 19:00:16.636675 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" containerName="heat-api" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.636682 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" containerName="heat-api" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.636906 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" containerName="heat-api" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.636922 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" containerName="heat-cfnapi" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.636934 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" containerName="heat-cfnapi" Jan 28 19:00:16 crc kubenswrapper[4749]: E0128 19:00:16.637130 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" containerName="heat-api" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.637137 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" 
containerName="heat-api" Jan 28 19:00:16 crc kubenswrapper[4749]: E0128 19:00:16.637168 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" containerName="heat-cfnapi" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.637175 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dba594-df49-48ca-9cdf-5ccb4a76d74c" containerName="heat-cfnapi" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.637389 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdd109b-56e0-4cf0-a763-8573984ee415" containerName="heat-api" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.638576 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.656186 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcvq7"] Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.669849 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-catalog-content\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.670008 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrvd6\" (UniqueName: \"kubernetes.io/projected/f24b8786-67ac-48d4-ac79-106722f66977-kube-api-access-jrvd6\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.670061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-utilities\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.772317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-catalog-content\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.772876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrvd6\" (UniqueName: \"kubernetes.io/projected/f24b8786-67ac-48d4-ac79-106722f66977-kube-api-access-jrvd6\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.772901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-utilities\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.930986 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-catalog-content\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.931023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-utilities\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.935531 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:16 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:16 crc kubenswrapper[4749]: > Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.949209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrvd6\" (UniqueName: \"kubernetes.io/projected/f24b8786-67ac-48d4-ac79-106722f66977-kube-api-access-jrvd6\") pod \"certified-operators-rcvq7\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:16 crc kubenswrapper[4749]: I0128 19:00:16.966737 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:22 crc kubenswrapper[4749]: E0128 19:00:21.737315 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537 is running failed: container process not found" containerID="6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 19:00:22 crc kubenswrapper[4749]: E0128 19:00:21.738289 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537 is running failed: container process not found" containerID="6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 19:00:22 crc kubenswrapper[4749]: E0128 19:00:21.739263 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537 is running failed: container process not found" containerID="6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 19:00:22 crc kubenswrapper[4749]: E0128 19:00:21.739305 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-4rdwp" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="registry-server" Jan 28 19:00:22 crc kubenswrapper[4749]: I0128 19:00:22.723375 4749 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-rcvq7"] Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.077477 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.167154 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w78fj\" (UniqueName: \"kubernetes.io/projected/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-kube-api-access-w78fj\") pod \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.167480 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-utilities\") pod \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.167561 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-catalog-content\") pod \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\" (UID: \"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1\") " Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.169923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-utilities" (OuterVolumeSpecName: "utilities") pod "44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" (UID: "44f085ce-f8ef-4f23-92f9-10f1cf5bbac1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:23 crc kubenswrapper[4749]: E0128 19:00:23.202663 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.208096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" (UID: "44f085ce-f8ef-4f23-92f9-10f1cf5bbac1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:23 crc kubenswrapper[4749]: E0128 19:00:23.208739 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 28 19:00:23 crc kubenswrapper[4749]: E0128 19:00:23.210539 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 28 19:00:23 crc kubenswrapper[4749]: E0128 19:00:23.210598 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-568fc79444-j9z48" podUID="47a65467-bac6-4a3c-b634-a3cbe9d282f3" containerName="heat-engine" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.221890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-kube-api-access-w78fj" (OuterVolumeSpecName: "kube-api-access-w78fj") pod "44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" (UID: "44f085ce-f8ef-4f23-92f9-10f1cf5bbac1"). InnerVolumeSpecName "kube-api-access-w78fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.271348 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.271394 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.271421 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w78fj\" (UniqueName: \"kubernetes.io/projected/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1-kube-api-access-w78fj\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.293940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rdwp" event={"ID":"44f085ce-f8ef-4f23-92f9-10f1cf5bbac1","Type":"ContainerDied","Data":"867e0329e8c76b080c3973010bf3947447300f85b4cf23dcbbf92bd91b2e6f5b"} Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.294003 4749 scope.go:117] "RemoveContainer" containerID="6cebf26cd52eaecbb6c8cf0df15a0becc9edd90cf65f83487cdfa08141f1b537" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.294193 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rdwp" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.304965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcvq7" event={"ID":"f24b8786-67ac-48d4-ac79-106722f66977","Type":"ContainerStarted","Data":"c1f2af4a10f91ba0da47a0ab463f9b92610df5ab8b57a8fdc0c2dbfdb5f55488"} Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.321895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pnwds" event={"ID":"d68f84a9-069c-4b25-a939-cd98ba9ab12b","Type":"ContainerStarted","Data":"e6287d955a75e17627fdac6dc5d624355ff29fbb5447022303ff0285d921fb70"} Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.362319 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rdwp"] Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.380608 4749 scope.go:117] "RemoveContainer" containerID="9c55ae714bddbfc9a23a95dfd2b1d4fcf2eadf9d0b585b1fb4241ff1dbea1850" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.382212 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rdwp"] Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.398826 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pnwds" podStartSLOduration=2.110518189 podStartE2EDuration="16.398803132s" podCreationTimestamp="2026-01-28 19:00:07 +0000 UTC" firstStartedPulling="2026-01-28 19:00:08.20076029 +0000 UTC m=+1476.212287065" lastFinishedPulling="2026-01-28 19:00:22.489045233 +0000 UTC m=+1490.500572008" observedRunningTime="2026-01-28 19:00:23.351873443 +0000 UTC m=+1491.363400228" watchObservedRunningTime="2026-01-28 19:00:23.398803132 +0000 UTC m=+1491.410329907" Jan 28 19:00:23 crc kubenswrapper[4749]: I0128 19:00:23.411985 4749 scope.go:117] "RemoveContainer" containerID="e0252959e510eb68239b1471cedb9131e129c7309d159863a0c309705ba775ac" Jan 28 19:00:24 crc kubenswrapper[4749]: I0128 19:00:24.374043 4749 generic.go:334] "Generic (PLEG): container finished" podID="f24b8786-67ac-48d4-ac79-106722f66977" containerID="06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0" exitCode=0 Jan 28 19:00:24 crc kubenswrapper[4749]: I0128 19:00:24.374103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcvq7" event={"ID":"f24b8786-67ac-48d4-ac79-106722f66977","Type":"ContainerDied","Data":"06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0"} Jan 28 19:00:24 crc kubenswrapper[4749]: I0128 19:00:24.384672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5nqn" event={"ID":"7aa7fd2d-ff29-4928-9e51-54ce06ec0028","Type":"ContainerStarted","Data":"c8c44fbc19b04c19f1560b66aa7bcea0e63364828b92ccfdf5c90020797a4386"} Jan 28 19:00:24 crc kubenswrapper[4749]: I0128 19:00:24.416000 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b5nqn" podStartSLOduration=5.752016161 podStartE2EDuration="20.415979682s" podCreationTimestamp="2026-01-28 19:00:04 +0000 UTC" firstStartedPulling="2026-01-28 19:00:07.954257086 +0000 UTC m=+1475.965783861" lastFinishedPulling="2026-01-28 19:00:22.618220607 +0000 UTC m=+1490.629747382" observedRunningTime="2026-01-28 19:00:24.414865574 +0000 UTC m=+1492.426392359" watchObservedRunningTime="2026-01-28 
19:00:24.415979682 +0000 UTC m=+1492.427506467" Jan 28 19:00:24 crc kubenswrapper[4749]: I0128 19:00:24.883993 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" path="/var/lib/kubelet/pods/44f085ce-f8ef-4f23-92f9-10f1cf5bbac1/volumes" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.269391 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.269779 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.418502 4749 generic.go:334] "Generic (PLEG): container finished" podID="26a1df38-b5be-4122-a907-fadff9ea2487" containerID="480f46ce4e474b47094116652cd0aa94cc8ea71c261898a28da6379393ce8a68" exitCode=137 Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.418620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerDied","Data":"480f46ce4e474b47094116652cd0aa94cc8ea71c261898a28da6379393ce8a68"} Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.698249 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.750198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-combined-ca-bundle\") pod \"26a1df38-b5be-4122-a907-fadff9ea2487\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.750557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-log-httpd\") pod \"26a1df38-b5be-4122-a907-fadff9ea2487\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.750748 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-run-httpd\") pod \"26a1df38-b5be-4122-a907-fadff9ea2487\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.750791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-scripts\") pod \"26a1df38-b5be-4122-a907-fadff9ea2487\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.750886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l25dp\" (UniqueName: \"kubernetes.io/projected/26a1df38-b5be-4122-a907-fadff9ea2487-kube-api-access-l25dp\") pod \"26a1df38-b5be-4122-a907-fadff9ea2487\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.750940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-config-data\") pod \"26a1df38-b5be-4122-a907-fadff9ea2487\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 
19:00:25.750985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-sg-core-conf-yaml\") pod \"26a1df38-b5be-4122-a907-fadff9ea2487\" (UID: \"26a1df38-b5be-4122-a907-fadff9ea2487\") " Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.751432 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "26a1df38-b5be-4122-a907-fadff9ea2487" (UID: "26a1df38-b5be-4122-a907-fadff9ea2487"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.751679 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.751679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "26a1df38-b5be-4122-a907-fadff9ea2487" (UID: "26a1df38-b5be-4122-a907-fadff9ea2487"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.765559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-scripts" (OuterVolumeSpecName: "scripts") pod "26a1df38-b5be-4122-a907-fadff9ea2487" (UID: "26a1df38-b5be-4122-a907-fadff9ea2487"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.789618 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a1df38-b5be-4122-a907-fadff9ea2487-kube-api-access-l25dp" (OuterVolumeSpecName: "kube-api-access-l25dp") pod "26a1df38-b5be-4122-a907-fadff9ea2487" (UID: "26a1df38-b5be-4122-a907-fadff9ea2487"). InnerVolumeSpecName "kube-api-access-l25dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.809883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "26a1df38-b5be-4122-a907-fadff9ea2487" (UID: "26a1df38-b5be-4122-a907-fadff9ea2487"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.858958 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26a1df38-b5be-4122-a907-fadff9ea2487-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.858997 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.859007 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l25dp\" (UniqueName: \"kubernetes.io/projected/26a1df38-b5be-4122-a907-fadff9ea2487-kube-api-access-l25dp\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:25 crc kubenswrapper[4749]: I0128 19:00:25.859018 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.047567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-config-data" (OuterVolumeSpecName: "config-data") pod "26a1df38-b5be-4122-a907-fadff9ea2487" (UID: "26a1df38-b5be-4122-a907-fadff9ea2487"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.055310 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26a1df38-b5be-4122-a907-fadff9ea2487" (UID: "26a1df38-b5be-4122-a907-fadff9ea2487"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.062906 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.062948 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a1df38-b5be-4122-a907-fadff9ea2487-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.339552 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b5nqn" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:26 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:26 crc kubenswrapper[4749]: > Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.432997 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.432956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26a1df38-b5be-4122-a907-fadff9ea2487","Type":"ContainerDied","Data":"b43583f2792bc6eb994e377cff59cfeea2b5b9c70b0de3a462e37662114ecacb"} Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.433501 4749 scope.go:117] "RemoveContainer" containerID="480f46ce4e474b47094116652cd0aa94cc8ea71c261898a28da6379393ce8a68" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.461438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcvq7" event={"ID":"f24b8786-67ac-48d4-ac79-106722f66977","Type":"ContainerStarted","Data":"b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29"} Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.504794 4749 scope.go:117] "RemoveContainer" containerID="1f5d2ba6e0f4233f2795d3ab8b6db4e9c2f3b04c37e17a7039e7728f385ccdc9" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.545763 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.559762 4749 scope.go:117] "RemoveContainer" containerID="28377c98a249b2dbc6102f099da44395b0e5123ddd994b459942eb436a3d6eaf" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.564653 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.586675 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:26 crc kubenswrapper[4749]: E0128 19:00:26.587192 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="extract-utilities" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587207 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="extract-utilities" Jan 28 19:00:26 crc kubenswrapper[4749]: E0128 19:00:26.587222 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="proxy-httpd" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587229 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="proxy-httpd" Jan 28 19:00:26 crc kubenswrapper[4749]: E0128 19:00:26.587250 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="extract-content" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587258 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="extract-content" Jan 28 19:00:26 crc kubenswrapper[4749]: E0128 19:00:26.587275 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="sg-core" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587282 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="sg-core" Jan 28 19:00:26 crc kubenswrapper[4749]: E0128 19:00:26.587293 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="ceilometer-notification-agent" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587299 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="ceilometer-notification-agent" Jan 28 19:00:26 crc kubenswrapper[4749]: E0128 19:00:26.587315 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="registry-server" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587321 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="registry-server" Jan 28 19:00:26 crc kubenswrapper[4749]: E0128 19:00:26.587429 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="ceilometer-central-agent" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587437 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="ceilometer-central-agent" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587670 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="ceilometer-notification-agent" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587693 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="sg-core" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587707 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="ceilometer-central-agent" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587720 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f085ce-f8ef-4f23-92f9-10f1cf5bbac1" containerName="registry-server" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.587740 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" containerName="proxy-httpd" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.589895 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.589976 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.596905 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.597694 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.671425 4749 scope.go:117] "RemoveContainer" containerID="81f41dea5270f00febe5e1418c94c60437f9b80add2450a56e984c8a68e0a319" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.699272 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-scripts\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.699378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-config-data\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.699411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.699705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmbrl\" (UniqueName: \"kubernetes.io/projected/d88f8585-8424-4d7b-807f-07c2bc469ffb-kube-api-access-hmbrl\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.699922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-run-httpd\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.699987 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-log-httpd\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.700089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.801924 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc 
kubenswrapper[4749]: I0128 19:00:26.802046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-scripts\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.802099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-config-data\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.802126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.802179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbrl\" (UniqueName: \"kubernetes.io/projected/d88f8585-8424-4d7b-807f-07c2bc469ffb-kube-api-access-hmbrl\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.802234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-run-httpd\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.802255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-log-httpd\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.802937 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-run-httpd\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.803066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-log-httpd\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.813151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-scripts\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.814173 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.814359 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.814447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-config-data\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.824776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbrl\" (UniqueName: \"kubernetes.io/projected/d88f8585-8424-4d7b-807f-07c2bc469ffb-kube-api-access-hmbrl\") pod \"ceilometer-0\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " pod="openstack/ceilometer-0" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.896873 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26a1df38-b5be-4122-a907-fadff9ea2487" path="/var/lib/kubelet/pods/26a1df38-b5be-4122-a907-fadff9ea2487/volumes" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.911291 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:26 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:26 crc kubenswrapper[4749]: > Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.911405 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.912603 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"4c791ddf7b0951e536637ecce5ac329ade0090d4a43dab367ede7e34c5c4e425"} pod="openshift-marketplace/redhat-operators-n5mr8" containerMessage="Container registry-server failed startup probe, will be restarted" Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.912688 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" containerID="cri-o://4c791ddf7b0951e536637ecce5ac329ade0090d4a43dab367ede7e34c5c4e425" gracePeriod=30 Jan 28 19:00:26 crc kubenswrapper[4749]: I0128 19:00:26.922554 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.432939 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.466856 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.466911 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.482750 4749 generic.go:334] "Generic (PLEG): container finished" podID="47a65467-bac6-4a3c-b634-a3cbe9d282f3" containerID="2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8" exitCode=0 Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.483005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-568fc79444-j9z48" event={"ID":"47a65467-bac6-4a3c-b634-a3cbe9d282f3","Type":"ContainerDied","Data":"2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8"} Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.483102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-568fc79444-j9z48" event={"ID":"47a65467-bac6-4a3c-b634-a3cbe9d282f3","Type":"ContainerDied","Data":"7966ac58ea5be4cd95b529000768cfc73c71a776135ce122580d4ff253b068bb"} Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.483135 4749 scope.go:117] "RemoveContainer" containerID="2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.483461 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-568fc79444-j9z48" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.513771 4749 scope.go:117] "RemoveContainer" containerID="2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8" Jan 28 19:00:27 crc kubenswrapper[4749]: E0128 19:00:27.514192 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8\": container with ID starting with 2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8 not found: ID does not exist" containerID="2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.514238 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8"} err="failed to get container status \"2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8\": rpc error: code = NotFound desc = could not find container \"2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8\": container with ID starting with 2cd3a75afd29e34c55c2cb59adbb1b31f41d0b1e31c9a8e940cbb533cbcdc3b8 not found: ID does not exist" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.516092 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.516867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z5xx\" (UniqueName: \"kubernetes.io/projected/47a65467-bac6-4a3c-b634-a3cbe9d282f3-kube-api-access-7z5xx\") pod \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.517000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-combined-ca-bundle\") pod \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.517161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data\") pod \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.517213 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data-custom\") pod \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\" (UID: \"47a65467-bac6-4a3c-b634-a3cbe9d282f3\") " Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.522718 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "47a65467-bac6-4a3c-b634-a3cbe9d282f3" (UID: "47a65467-bac6-4a3c-b634-a3cbe9d282f3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.527049 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a65467-bac6-4a3c-b634-a3cbe9d282f3-kube-api-access-7z5xx" (OuterVolumeSpecName: "kube-api-access-7z5xx") pod "47a65467-bac6-4a3c-b634-a3cbe9d282f3" (UID: "47a65467-bac6-4a3c-b634-a3cbe9d282f3"). InnerVolumeSpecName "kube-api-access-7z5xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.558294 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47a65467-bac6-4a3c-b634-a3cbe9d282f3" (UID: "47a65467-bac6-4a3c-b634-a3cbe9d282f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.625418 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.625463 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z5xx\" (UniqueName: \"kubernetes.io/projected/47a65467-bac6-4a3c-b634-a3cbe9d282f3-kube-api-access-7z5xx\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.625479 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.633628 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data" (OuterVolumeSpecName: "config-data") pod "47a65467-bac6-4a3c-b634-a3cbe9d282f3" (UID: "47a65467-bac6-4a3c-b634-a3cbe9d282f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.727399 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a65467-bac6-4a3c-b634-a3cbe9d282f3-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.833589 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-568fc79444-j9z48"] Jan 28 19:00:27 crc kubenswrapper[4749]: I0128 19:00:27.849205 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-568fc79444-j9z48"] Jan 28 19:00:28 crc kubenswrapper[4749]: I0128 19:00:28.498767 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerID="4c791ddf7b0951e536637ecce5ac329ade0090d4a43dab367ede7e34c5c4e425" exitCode=0 Jan 28 19:00:28 crc kubenswrapper[4749]: I0128 19:00:28.498845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerDied","Data":"4c791ddf7b0951e536637ecce5ac329ade0090d4a43dab367ede7e34c5c4e425"} Jan 28 19:00:28 crc kubenswrapper[4749]: I0128 19:00:28.501287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerStarted","Data":"aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a"} Jan 28 19:00:28 crc kubenswrapper[4749]: I0128 19:00:28.501377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerStarted","Data":"a9130a1d283568a358cc3a3a6e06e7b0cba416718830ee920fc187af79f5c5a9"} Jan 28 19:00:28 crc kubenswrapper[4749]: I0128 19:00:28.503624 4749 generic.go:334] "Generic (PLEG): container finished" podID="f24b8786-67ac-48d4-ac79-106722f66977" containerID="b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29" exitCode=0 Jan 28 19:00:28 crc kubenswrapper[4749]: I0128 19:00:28.503693 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcvq7" event={"ID":"f24b8786-67ac-48d4-ac79-106722f66977","Type":"ContainerDied","Data":"b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29"} Jan 28 19:00:28 crc kubenswrapper[4749]: I0128 19:00:28.908423 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a65467-bac6-4a3c-b634-a3cbe9d282f3" path="/var/lib/kubelet/pods/47a65467-bac6-4a3c-b634-a3cbe9d282f3/volumes" Jan 28 19:00:32 crc kubenswrapper[4749]: I0128 19:00:32.551424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerStarted","Data":"33ca4c7b511939cde9c2d3023340f2eb8af79c630f3f9f74f5285f25d1b5d214"} Jan 28 19:00:35 crc kubenswrapper[4749]: I0128 19:00:35.586901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerStarted","Data":"29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2"} Jan 28 19:00:35 crc kubenswrapper[4749]: I0128 19:00:35.594462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcvq7" 
event={"ID":"f24b8786-67ac-48d4-ac79-106722f66977","Type":"ContainerStarted","Data":"4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9"} Jan 28 19:00:35 crc kubenswrapper[4749]: I0128 19:00:35.613988 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rcvq7" podStartSLOduration=9.38244164 podStartE2EDuration="19.613966239s" podCreationTimestamp="2026-01-28 19:00:16 +0000 UTC" firstStartedPulling="2026-01-28 19:00:24.378372376 +0000 UTC m=+1492.389899151" lastFinishedPulling="2026-01-28 19:00:34.609896975 +0000 UTC m=+1502.621423750" observedRunningTime="2026-01-28 19:00:35.610462363 +0000 UTC m=+1503.621989158" watchObservedRunningTime="2026-01-28 19:00:35.613966239 +0000 UTC m=+1503.625493004" Jan 28 19:00:35 crc kubenswrapper[4749]: I0128 19:00:35.843952 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 19:00:35 crc kubenswrapper[4749]: I0128 19:00:35.843993 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 19:00:36 crc kubenswrapper[4749]: I0128 19:00:36.333357 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b5nqn" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:36 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:36 crc kubenswrapper[4749]: > Jan 28 19:00:36 crc kubenswrapper[4749]: I0128 19:00:36.606035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerStarted","Data":"a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c"} Jan 28 19:00:36 crc kubenswrapper[4749]: I0128 19:00:36.915040 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:36 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:36 crc kubenswrapper[4749]: > Jan 28 19:00:36 crc kubenswrapper[4749]: I0128 19:00:36.967026 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:36 crc kubenswrapper[4749]: I0128 19:00:36.967071 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:38 crc kubenswrapper[4749]: I0128 19:00:38.025311 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rcvq7" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:38 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:38 crc kubenswrapper[4749]: > Jan 28 19:00:38 crc kubenswrapper[4749]: I0128 19:00:38.632617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerStarted","Data":"6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143"} Jan 28 19:00:38 crc kubenswrapper[4749]: I0128 19:00:38.632883 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" 
Jan 28 19:00:39 crc kubenswrapper[4749]: I0128 19:00:39.020896 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.124934437 podStartE2EDuration="13.020876345s" podCreationTimestamp="2026-01-28 19:00:26 +0000 UTC" firstStartedPulling="2026-01-28 19:00:27.531401705 +0000 UTC m=+1495.542928480" lastFinishedPulling="2026-01-28 19:00:37.427343613 +0000 UTC m=+1505.438870388" observedRunningTime="2026-01-28 19:00:38.661942014 +0000 UTC m=+1506.673468809" watchObservedRunningTime="2026-01-28 19:00:39.020876345 +0000 UTC m=+1507.032403120" Jan 28 19:00:39 crc kubenswrapper[4749]: I0128 19:00:39.027052 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:40 crc kubenswrapper[4749]: I0128 19:00:40.654200 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="ceilometer-central-agent" containerID="cri-o://aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a" gracePeriod=30 Jan 28 19:00:40 crc kubenswrapper[4749]: I0128 19:00:40.654236 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="proxy-httpd" containerID="cri-o://6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143" gracePeriod=30 Jan 28 19:00:40 crc kubenswrapper[4749]: I0128 19:00:40.654243 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="sg-core" containerID="cri-o://a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c" gracePeriod=30 Jan 28 19:00:40 crc kubenswrapper[4749]: I0128 19:00:40.654297 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="ceilometer-notification-agent" containerID="cri-o://29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2" gracePeriod=30 Jan 28 19:00:41 crc kubenswrapper[4749]: E0128 19:00:41.156054 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd88f8585_8424_4d7b_807f_07c2bc469ffb.slice/crio-conmon-29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2.scope\": RecentStats: unable to find data in memory cache]" Jan 28 19:00:41 crc kubenswrapper[4749]: I0128 19:00:41.667829 4749 generic.go:334] "Generic (PLEG): container finished" podID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerID="6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143" exitCode=0 Jan 28 19:00:41 crc kubenswrapper[4749]: I0128 19:00:41.667869 4749 generic.go:334] "Generic (PLEG): container finished" podID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerID="a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c" exitCode=2 Jan 28 19:00:41 crc kubenswrapper[4749]: I0128 19:00:41.667877 4749 generic.go:334] "Generic (PLEG): container finished" podID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerID="29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2" exitCode=0 Jan 28 19:00:41 crc kubenswrapper[4749]: I0128 19:00:41.667895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerDied","Data":"6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143"} Jan 28 19:00:41 crc kubenswrapper[4749]: I0128 19:00:41.667924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerDied","Data":"a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c"} Jan 28 19:00:41 crc kubenswrapper[4749]: I0128 19:00:41.667938 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerDied","Data":"29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2"} Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.331131 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b5nqn" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:46 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:46 crc kubenswrapper[4749]: > Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.341820 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.520974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-combined-ca-bundle\") pod \"d88f8585-8424-4d7b-807f-07c2bc469ffb\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.521075 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-log-httpd\") pod \"d88f8585-8424-4d7b-807f-07c2bc469ffb\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.521122 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-scripts\") pod \"d88f8585-8424-4d7b-807f-07c2bc469ffb\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.521259 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-config-data\") pod \"d88f8585-8424-4d7b-807f-07c2bc469ffb\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.521318 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmbrl\" (UniqueName: \"kubernetes.io/projected/d88f8585-8424-4d7b-807f-07c2bc469ffb-kube-api-access-hmbrl\") pod \"d88f8585-8424-4d7b-807f-07c2bc469ffb\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.521438 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-sg-core-conf-yaml\") pod \"d88f8585-8424-4d7b-807f-07c2bc469ffb\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.521476 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-run-httpd\") pod \"d88f8585-8424-4d7b-807f-07c2bc469ffb\" (UID: \"d88f8585-8424-4d7b-807f-07c2bc469ffb\") " Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.521559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d88f8585-8424-4d7b-807f-07c2bc469ffb" (UID: "d88f8585-8424-4d7b-807f-07c2bc469ffb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.521897 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d88f8585-8424-4d7b-807f-07c2bc469ffb" (UID: "d88f8585-8424-4d7b-807f-07c2bc469ffb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.522231 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.522252 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d88f8585-8424-4d7b-807f-07c2bc469ffb-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.528588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-scripts" (OuterVolumeSpecName: "scripts") pod "d88f8585-8424-4d7b-807f-07c2bc469ffb" (UID: "d88f8585-8424-4d7b-807f-07c2bc469ffb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.529214 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d88f8585-8424-4d7b-807f-07c2bc469ffb-kube-api-access-hmbrl" (OuterVolumeSpecName: "kube-api-access-hmbrl") pod "d88f8585-8424-4d7b-807f-07c2bc469ffb" (UID: "d88f8585-8424-4d7b-807f-07c2bc469ffb"). InnerVolumeSpecName "kube-api-access-hmbrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.579947 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d88f8585-8424-4d7b-807f-07c2bc469ffb" (UID: "d88f8585-8424-4d7b-807f-07c2bc469ffb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.624618 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmbrl\" (UniqueName: \"kubernetes.io/projected/d88f8585-8424-4d7b-807f-07c2bc469ffb-kube-api-access-hmbrl\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.624964 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.625050 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.634183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d88f8585-8424-4d7b-807f-07c2bc469ffb" (UID: "d88f8585-8424-4d7b-807f-07c2bc469ffb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.649923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-config-data" (OuterVolumeSpecName: "config-data") pod "d88f8585-8424-4d7b-807f-07c2bc469ffb" (UID: "d88f8585-8424-4d7b-807f-07c2bc469ffb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.722289 4749 generic.go:334] "Generic (PLEG): container finished" podID="d68f84a9-069c-4b25-a939-cd98ba9ab12b" containerID="e6287d955a75e17627fdac6dc5d624355ff29fbb5447022303ff0285d921fb70" exitCode=0 Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.722440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pnwds" event={"ID":"d68f84a9-069c-4b25-a939-cd98ba9ab12b","Type":"ContainerDied","Data":"e6287d955a75e17627fdac6dc5d624355ff29fbb5447022303ff0285d921fb70"} Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.724850 4749 generic.go:334] "Generic (PLEG): container finished" podID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerID="aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a" exitCode=0 Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.725004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerDied","Data":"aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a"} Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.725056 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d88f8585-8424-4d7b-807f-07c2bc469ffb","Type":"ContainerDied","Data":"a9130a1d283568a358cc3a3a6e06e7b0cba416718830ee920fc187af79f5c5a9"} Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.725074 4749 scope.go:117] "RemoveContainer" containerID="6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.725186 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.726758 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.726788 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88f8585-8424-4d7b-807f-07c2bc469ffb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.751202 4749 scope.go:117] "RemoveContainer" containerID="a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.773117 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.779127 4749 scope.go:117] "RemoveContainer" containerID="29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.789664 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.810598 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.811216 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a65467-bac6-4a3c-b634-a3cbe9d282f3" containerName="heat-engine" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.811293 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a65467-bac6-4a3c-b634-a3cbe9d282f3" containerName="heat-engine" Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.811715 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="ceilometer-notification-agent" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.811801 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="ceilometer-notification-agent" Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.811893 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="proxy-httpd" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.811950 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="proxy-httpd" Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.812034 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="ceilometer-central-agent" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.812089 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="ceilometer-central-agent" Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.812171 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="sg-core" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.812230 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="sg-core" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.812501 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" 
containerName="sg-core" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.812589 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="ceilometer-notification-agent" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.812649 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="ceilometer-central-agent" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.812710 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a65467-bac6-4a3c-b634-a3cbe9d282f3" containerName="heat-engine" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.812777 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" containerName="proxy-httpd" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.814875 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.815064 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.824566 4749 scope.go:117] "RemoveContainer" containerID="aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.827932 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.828073 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.875515 4749 scope.go:117] "RemoveContainer" containerID="6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143" Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.877512 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143\": container with ID starting with 6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143 not found: ID does not exist" containerID="6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.877647 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143"} err="failed to get container status \"6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143\": rpc error: code = NotFound desc = could not find container \"6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143\": container with ID starting with 6b8029fb66087ee676066c41b3bc720ee71dd00045fb0553a392e67a9ea1b143 not found: ID does not exist" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.877747 4749 scope.go:117] "RemoveContainer" containerID="a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c" Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.880940 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c\": container with ID starting with a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c not found: ID does not exist" 
containerID="a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.880981 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c"} err="failed to get container status \"a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c\": rpc error: code = NotFound desc = could not find container \"a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c\": container with ID starting with a4028fb6e426ebd9d4ae1f667551c4a6a2594639ac0ed0fce5e5f127efea209c not found: ID does not exist" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.881003 4749 scope.go:117] "RemoveContainer" containerID="29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2" Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.881976 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2\": container with ID starting with 29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2 not found: ID does not exist" containerID="29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.881998 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2"} err="failed to get container status \"29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2\": rpc error: code = NotFound desc = could not find container \"29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2\": container with ID starting with 29c971851759e8f65cff308e05623df7805e9b03c10baf6432d40099b255e6d2 not found: ID does not exist" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.882011 4749 scope.go:117] "RemoveContainer" containerID="aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a" Jan 28 19:00:46 crc kubenswrapper[4749]: E0128 19:00:46.882426 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a\": container with ID starting with aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a not found: ID does not exist" containerID="aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.882452 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a"} err="failed to get container status \"aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a\": rpc error: code = NotFound desc = could not find container \"aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a\": container with ID starting with aebdec36b2e449765710e274ef1bff6fed4e5415551a9a893f54eaa19a98057a not found: ID does not exist" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.896287 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d88f8585-8424-4d7b-807f-07c2bc469ffb" path="/var/lib/kubelet/pods/d88f8585-8424-4d7b-807f-07c2bc469ffb/volumes" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.898312 4749 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:46 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:46 crc kubenswrapper[4749]: > Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.939835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.939889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlzmg\" (UniqueName: \"kubernetes.io/projected/e08c55a7-aec7-463e-8567-6afe8c166fcb-kube-api-access-jlzmg\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.940090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-scripts\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.940174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-log-httpd\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.940554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-run-httpd\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.940590 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-config-data\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:46 crc kubenswrapper[4749]: I0128 19:00:46.940656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.043359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.043437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlzmg\" (UniqueName: \"kubernetes.io/projected/e08c55a7-aec7-463e-8567-6afe8c166fcb-kube-api-access-jlzmg\") pod 
\"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.043532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-scripts\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.043654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-log-httpd\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.044246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-log-httpd\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.044607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-run-httpd\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.044641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-config-data\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.044988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-run-httpd\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.045380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.048403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.048588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-scripts\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.050269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-config-data\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 
19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.050942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.062682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlzmg\" (UniqueName: \"kubernetes.io/projected/e08c55a7-aec7-463e-8567-6afe8c166fcb-kube-api-access-jlzmg\") pod \"ceilometer-0\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.147541 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:00:47 crc kubenswrapper[4749]: W0128 19:00:47.612040 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode08c55a7_aec7_463e_8567_6afe8c166fcb.slice/crio-a5d143bc0a4c3ed117851ba51ee2c7ba41b3f212ad751f4340453dac70019bd9 WatchSource:0}: Error finding container a5d143bc0a4c3ed117851ba51ee2c7ba41b3f212ad751f4340453dac70019bd9: Status 404 returned error can't find the container with id a5d143bc0a4c3ed117851ba51ee2c7ba41b3f212ad751f4340453dac70019bd9 Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.614888 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:47 crc kubenswrapper[4749]: I0128 19:00:47.738759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerStarted","Data":"a5d143bc0a4c3ed117851ba51ee2c7ba41b3f212ad751f4340453dac70019bd9"} Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.023684 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rcvq7" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:48 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:48 crc kubenswrapper[4749]: > Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.240385 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.276896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-scripts\") pod \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.277698 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-combined-ca-bundle\") pod \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.277789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5srf\" (UniqueName: \"kubernetes.io/projected/d68f84a9-069c-4b25-a939-cd98ba9ab12b-kube-api-access-m5srf\") pod \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.278208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-config-data\") pod \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\" (UID: \"d68f84a9-069c-4b25-a939-cd98ba9ab12b\") " Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.298548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d68f84a9-069c-4b25-a939-cd98ba9ab12b-kube-api-access-m5srf" (OuterVolumeSpecName: "kube-api-access-m5srf") pod "d68f84a9-069c-4b25-a939-cd98ba9ab12b" (UID: "d68f84a9-069c-4b25-a939-cd98ba9ab12b"). InnerVolumeSpecName "kube-api-access-m5srf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.313973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-scripts" (OuterVolumeSpecName: "scripts") pod "d68f84a9-069c-4b25-a939-cd98ba9ab12b" (UID: "d68f84a9-069c-4b25-a939-cd98ba9ab12b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.381577 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5srf\" (UniqueName: \"kubernetes.io/projected/d68f84a9-069c-4b25-a939-cd98ba9ab12b-kube-api-access-m5srf\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.381621 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.389565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-config-data" (OuterVolumeSpecName: "config-data") pod "d68f84a9-069c-4b25-a939-cd98ba9ab12b" (UID: "d68f84a9-069c-4b25-a939-cd98ba9ab12b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.392561 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d68f84a9-069c-4b25-a939-cd98ba9ab12b" (UID: "d68f84a9-069c-4b25-a939-cd98ba9ab12b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.483155 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.483210 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68f84a9-069c-4b25-a939-cd98ba9ab12b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.761202 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerStarted","Data":"4f9157dd50c612a7ce0631afec387a6a72dc11eb5d5a558ebdef1621723cfd9c"} Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.764733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pnwds" event={"ID":"d68f84a9-069c-4b25-a939-cd98ba9ab12b","Type":"ContainerDied","Data":"e6cc7a99716f35cb21bc930ffba1a728e078a38e1db66a0b4a645b7941354933"} Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.764764 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6cc7a99716f35cb21bc930ffba1a728e078a38e1db66a0b4a645b7941354933" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.764819 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pnwds" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.956674 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:00:48 crc kubenswrapper[4749]: E0128 19:00:48.957224 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68f84a9-069c-4b25-a939-cd98ba9ab12b" containerName="nova-cell0-conductor-db-sync" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.957267 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68f84a9-069c-4b25-a939-cd98ba9ab12b" containerName="nova-cell0-conductor-db-sync" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.957490 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d68f84a9-069c-4b25-a939-cd98ba9ab12b" containerName="nova-cell0-conductor-db-sync" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.958340 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.961775 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-stslj" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.961963 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 19:00:48 crc kubenswrapper[4749]: I0128 19:00:48.974931 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.078887 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:00:49 crc kubenswrapper[4749]: E0128 19:00:49.081339 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-dxqnc], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell0-conductor-0" podUID="3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.099101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.099732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.099785 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxqnc\" (UniqueName: \"kubernetes.io/projected/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-kube-api-access-dxqnc\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.201892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.202093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.202119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxqnc\" (UniqueName: \"kubernetes.io/projected/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-kube-api-access-dxqnc\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.207269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.212842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.213008 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.245023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxqnc\" (UniqueName: \"kubernetes.io/projected/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-kube-api-access-dxqnc\") pod \"nova-cell0-conductor-0\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.775665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerStarted","Data":"07a457c0188557adfaa96cc458ff67b545a4585df69e282bea963e19f3662609"} Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.775685 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:49 crc kubenswrapper[4749]: I0128 19:00:49.834348 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.022074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-config-data\") pod \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.022902 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-combined-ca-bundle\") pod \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.023045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxqnc\" (UniqueName: \"kubernetes.io/projected/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-kube-api-access-dxqnc\") pod \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\" (UID: \"3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95\") " Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.028538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-kube-api-access-dxqnc" (OuterVolumeSpecName: "kube-api-access-dxqnc") pod "3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95" (UID: "3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95"). InnerVolumeSpecName "kube-api-access-dxqnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.029092 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95" (UID: "3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.029440 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-config-data" (OuterVolumeSpecName: "config-data") pod "3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95" (UID: "3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.125080 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxqnc\" (UniqueName: \"kubernetes.io/projected/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-kube-api-access-dxqnc\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.125113 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.125123 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.788170 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.788362 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerStarted","Data":"633f9ee9574627558a2d6b946515b84496c214efeb0f643a987ba3459f9cae95"} Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.844633 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.867289 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.917123 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95" path="/var/lib/kubelet/pods/3e50bf2c-0c86-4fd0-a3cf-2a8e8e945f95/volumes" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.917640 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.920868 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.920977 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.925195 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 19:00:50 crc kubenswrapper[4749]: I0128 19:00:50.925417 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-stslj" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.050424 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blf4z\" (UniqueName: \"kubernetes.io/projected/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-kube-api-access-blf4z\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.050677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.050838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.153478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blf4z\" (UniqueName: \"kubernetes.io/projected/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-kube-api-access-blf4z\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.153817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.154001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.158478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.159463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.174675 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blf4z\" (UniqueName: \"kubernetes.io/projected/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-kube-api-access-blf4z\") pod \"nova-cell0-conductor-0\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.257079 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.774203 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:00:51 crc kubenswrapper[4749]: W0128 19:00:51.779510 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcffbcc50_b658_4bc3_afcb_b037f0a4b6ec.slice/crio-6db9b9bd7566ce99177d22bb4c98ee5fc9228f8e69d622df66d724371f6ea1d2 WatchSource:0}: Error finding container 6db9b9bd7566ce99177d22bb4c98ee5fc9228f8e69d622df66d724371f6ea1d2: Status 404 returned error can't find the container with id 6db9b9bd7566ce99177d22bb4c98ee5fc9228f8e69d622df66d724371f6ea1d2 Jan 28 19:00:51 crc kubenswrapper[4749]: I0128 19:00:51.801555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec","Type":"ContainerStarted","Data":"6db9b9bd7566ce99177d22bb4c98ee5fc9228f8e69d622df66d724371f6ea1d2"} Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.834076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec","Type":"ContainerStarted","Data":"ffa38e46c426004907460b26c4c2e56146d7e9a53299a6a7c4a9382ee037a0de"} Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.834750 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.838080 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerStarted","Data":"e1b383e112ca3d31ce6fed2ad8e8e2d0d7bc1e67c9366b3a591db1ad2aed47ee"} Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.838300 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="ceilometer-central-agent" containerID="cri-o://4f9157dd50c612a7ce0631afec387a6a72dc11eb5d5a558ebdef1621723cfd9c" gracePeriod=30 Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.838622 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.838683 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="proxy-httpd" containerID="cri-o://e1b383e112ca3d31ce6fed2ad8e8e2d0d7bc1e67c9366b3a591db1ad2aed47ee" gracePeriod=30 Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.838745 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="sg-core" containerID="cri-o://633f9ee9574627558a2d6b946515b84496c214efeb0f643a987ba3459f9cae95" gracePeriod=30 Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.838786 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="ceilometer-notification-agent" containerID="cri-o://07a457c0188557adfaa96cc458ff67b545a4585df69e282bea963e19f3662609" gracePeriod=30 Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.856661 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.8566375600000002 podStartE2EDuration="2.85663756s" podCreationTimestamp="2026-01-28 19:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:52.85177988 +0000 UTC m=+1520.863306665" watchObservedRunningTime="2026-01-28 19:00:52.85663756 +0000 UTC m=+1520.868164365" Jan 28 19:00:52 crc kubenswrapper[4749]: I0128 19:00:52.905895 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.869420194 podStartE2EDuration="6.905861055s" podCreationTimestamp="2026-01-28 19:00:46 +0000 UTC" firstStartedPulling="2026-01-28 19:00:47.615703927 +0000 UTC m=+1515.627230702" lastFinishedPulling="2026-01-28 19:00:51.652144788 +0000 UTC m=+1519.663671563" observedRunningTime="2026-01-28 19:00:52.88956687 +0000 UTC m=+1520.901093685" watchObservedRunningTime="2026-01-28 19:00:52.905861055 +0000 UTC m=+1520.917387840" Jan 28 19:00:53 crc kubenswrapper[4749]: I0128 19:00:53.852772 4749 generic.go:334] "Generic (PLEG): container finished" podID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerID="e1b383e112ca3d31ce6fed2ad8e8e2d0d7bc1e67c9366b3a591db1ad2aed47ee" exitCode=0 Jan 28 19:00:53 crc kubenswrapper[4749]: I0128 19:00:53.853143 4749 generic.go:334] "Generic (PLEG): container finished" podID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerID="633f9ee9574627558a2d6b946515b84496c214efeb0f643a987ba3459f9cae95" exitCode=2 Jan 28 19:00:53 crc kubenswrapper[4749]: I0128 19:00:53.853157 4749 generic.go:334] "Generic (PLEG): container finished" podID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerID="07a457c0188557adfaa96cc458ff67b545a4585df69e282bea963e19f3662609" exitCode=0 Jan 28 19:00:53 crc kubenswrapper[4749]: I0128 19:00:53.852823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerDied","Data":"e1b383e112ca3d31ce6fed2ad8e8e2d0d7bc1e67c9366b3a591db1ad2aed47ee"} Jan 28 19:00:53 crc kubenswrapper[4749]: I0128 19:00:53.853245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerDied","Data":"633f9ee9574627558a2d6b946515b84496c214efeb0f643a987ba3459f9cae95"} Jan 28 19:00:53 crc kubenswrapper[4749]: I0128 19:00:53.853259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerDied","Data":"07a457c0188557adfaa96cc458ff67b545a4585df69e282bea963e19f3662609"} Jan 28 19:00:55 crc kubenswrapper[4749]: I0128 19:00:55.321240 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:55 crc kubenswrapper[4749]: I0128 19:00:55.380039 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:55 crc kubenswrapper[4749]: I0128 
19:00:55.562171 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5nqn"] Jan 28 19:00:56 crc kubenswrapper[4749]: I0128 19:00:56.291944 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 28 19:00:56 crc kubenswrapper[4749]: I0128 19:00:56.892098 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dhv8b"] Jan 28 19:00:56 crc kubenswrapper[4749]: I0128 19:00:56.894003 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:56 crc kubenswrapper[4749]: I0128 19:00:56.898840 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 19:00:56 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:00:56 crc kubenswrapper[4749]: > Jan 28 19:00:56 crc kubenswrapper[4749]: I0128 19:00:56.900296 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 28 19:00:56 crc kubenswrapper[4749]: I0128 19:00:56.903041 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 28 19:00:56 crc kubenswrapper[4749]: I0128 19:00:56.904451 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dhv8b"] Jan 28 19:00:56 crc kubenswrapper[4749]: I0128 19:00:56.984083 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b5nqn" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="registry-server" containerID="cri-o://c8c44fbc19b04c19f1560b66aa7bcea0e63364828b92ccfdf5c90020797a4386" gracePeriod=2 Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.006285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.006607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-config-data\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.006646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-scripts\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.006678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6xc5\" (UniqueName: \"kubernetes.io/projected/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-kube-api-access-s6xc5\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " 
pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.070401 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.109338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.109731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-config-data\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.121703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-scripts\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.121773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6xc5\" (UniqueName: \"kubernetes.io/projected/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-kube-api-access-s6xc5\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.130437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-config-data\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.133990 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.145891 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-scripts\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.169218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6xc5\" (UniqueName: \"kubernetes.io/projected/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-kube-api-access-s6xc5\") pod \"nova-cell0-cell-mapping-dhv8b\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.195537 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.197600 4749 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.206669 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.215883 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.228288 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.230459 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.234168 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.240236 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.308747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.325768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.325872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.325904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-config-data\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.325939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-config-data\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.326036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdmrn\" (UniqueName: \"kubernetes.io/projected/e0970657-d202-4540-a15e-66bbf7878ff8-kube-api-access-vdmrn\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.326098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l95nv\" (UniqueName: \"kubernetes.io/projected/cd37ff46-765e-4543-ba21-e5bc7df64457-kube-api-access-l95nv\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 
19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.326121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0970657-d202-4540-a15e-66bbf7878ff8-logs\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.355160 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.440864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdmrn\" (UniqueName: \"kubernetes.io/projected/e0970657-d202-4540-a15e-66bbf7878ff8-kube-api-access-vdmrn\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.441189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l95nv\" (UniqueName: \"kubernetes.io/projected/cd37ff46-765e-4543-ba21-e5bc7df64457-kube-api-access-l95nv\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.441225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0970657-d202-4540-a15e-66bbf7878ff8-logs\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.441820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.442952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.443024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-config-data\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.443132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-config-data\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.445636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0970657-d202-4540-a15e-66bbf7878ff8-logs\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.468096 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.473860 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.490968 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.474886 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-config-data\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.480118 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.485003 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-config-data\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.505223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l95nv\" (UniqueName: \"kubernetes.io/projected/cd37ff46-765e-4543-ba21-e5bc7df64457-kube-api-access-l95nv\") pod \"nova-scheduler-0\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.505955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdmrn\" (UniqueName: \"kubernetes.io/projected/e0970657-d202-4540-a15e-66bbf7878ff8-kube-api-access-vdmrn\") pod \"nova-api-0\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.518545 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.529241 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.573849 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.583889 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.612578 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.696898 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.708117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2knj7\" (UniqueName: \"kubernetes.io/projected/36178046-e97b-4886-b9e8-94df2cccdfbd-kube-api-access-2knj7\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.718620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.718724 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36178046-e97b-4886-b9e8-94df2cccdfbd-logs\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.718871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-config-data\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.792004 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.793514 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.796351 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.821600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-config-data\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.822026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w8mm\" (UniqueName: \"kubernetes.io/projected/db2b8c43-d40b-4cc6-9b84-d9d43660af11-kube-api-access-2w8mm\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.828516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2knj7\" (UniqueName: \"kubernetes.io/projected/36178046-e97b-4886-b9e8-94df2cccdfbd-kube-api-access-2knj7\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.828652 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.828739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.829316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.829511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36178046-e97b-4886-b9e8-94df2cccdfbd-logs\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.830680 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-config-data\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.833511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36178046-e97b-4886-b9e8-94df2cccdfbd-logs\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " 
pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.839130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.867914 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.874061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2knj7\" (UniqueName: \"kubernetes.io/projected/36178046-e97b-4886-b9e8-94df2cccdfbd-kube-api-access-2knj7\") pod \"nova-metadata-0\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " pod="openstack/nova-metadata-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.908585 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-mf4jk"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.911297 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.932488 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-config\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.932554 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w8mm\" (UniqueName: \"kubernetes.io/projected/db2b8c43-d40b-4cc6-9b84-d9d43660af11-kube-api-access-2w8mm\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.932630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-svc\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.932696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.932717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.932759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " 
pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.932974 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.933029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld62j\" (UniqueName: \"kubernetes.io/projected/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-kube-api-access-ld62j\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.933131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.939562 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-mf4jk"] Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.951597 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.956881 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.966383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w8mm\" (UniqueName: \"kubernetes.io/projected/db2b8c43-d40b-4cc6-9b84-d9d43660af11-kube-api-access-2w8mm\") pod \"nova-cell1-novncproxy-0\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:57 crc kubenswrapper[4749]: I0128 19:00:57.991374 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.004396 4749 generic.go:334] "Generic (PLEG): container finished" podID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerID="c8c44fbc19b04c19f1560b66aa7bcea0e63364828b92ccfdf5c90020797a4386" exitCode=0 Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.005418 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5nqn" event={"ID":"7aa7fd2d-ff29-4928-9e51-54ce06ec0028","Type":"ContainerDied","Data":"c8c44fbc19b04c19f1560b66aa7bcea0e63364828b92ccfdf5c90020797a4386"} Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.035471 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.035547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld62j\" (UniqueName: \"kubernetes.io/projected/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-kube-api-access-ld62j\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.035595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.035635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-config\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.035680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-svc\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.035758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.036775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.037450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.039171 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-config\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.039440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.039722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-svc\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.041287 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcvq7"] Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.073627 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld62j\" (UniqueName: \"kubernetes.io/projected/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-kube-api-access-ld62j\") pod \"dnsmasq-dns-9b86998b5-mf4jk\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.157570 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.196050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dhv8b"] Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.248990 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.547802 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.566591 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4r4v\" (UniqueName: \"kubernetes.io/projected/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-kube-api-access-p4r4v\") pod \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.566765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-catalog-content\") pod \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.566978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-utilities\") pod \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\" (UID: \"7aa7fd2d-ff29-4928-9e51-54ce06ec0028\") " Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.568227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-utilities" (OuterVolumeSpecName: "utilities") pod "7aa7fd2d-ff29-4928-9e51-54ce06ec0028" (UID: "7aa7fd2d-ff29-4928-9e51-54ce06ec0028"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.579021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-kube-api-access-p4r4v" (OuterVolumeSpecName: "kube-api-access-p4r4v") pod "7aa7fd2d-ff29-4928-9e51-54ce06ec0028" (UID: "7aa7fd2d-ff29-4928-9e51-54ce06ec0028"). InnerVolumeSpecName "kube-api-access-p4r4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.657919 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7aa7fd2d-ff29-4928-9e51-54ce06ec0028" (UID: "7aa7fd2d-ff29-4928-9e51-54ce06ec0028"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.670433 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.671391 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.671503 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4r4v\" (UniqueName: \"kubernetes.io/projected/7aa7fd2d-ff29-4928-9e51-54ce06ec0028-kube-api-access-p4r4v\") on node \"crc\" DevicePath \"\"" Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.716929 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.729015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:00:58 crc kubenswrapper[4749]: I0128 19:00:58.953736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.031666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dhv8b" event={"ID":"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe","Type":"ContainerStarted","Data":"b5cfbe0c5a127402e7ad2df9ec5c73613547fe684456155a29b15843ca5143eb"} Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.031721 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dhv8b" event={"ID":"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe","Type":"ContainerStarted","Data":"44f89da9f63b272d40ba735624e055116d0b191b1ebec980d9a9b16a146bde95"} Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.035864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0970657-d202-4540-a15e-66bbf7878ff8","Type":"ContainerStarted","Data":"f96ae341e7f003e84137a0119e791ba8afd4396c4d41fbdf958656854c1a22a3"} Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.039194 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5nqn" event={"ID":"7aa7fd2d-ff29-4928-9e51-54ce06ec0028","Type":"ContainerDied","Data":"1752c6305e2258b5d2282a29b405dea823a8c8600235132dc8af301d4a4face2"} Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.039248 4749 scope.go:117] "RemoveContainer" containerID="c8c44fbc19b04c19f1560b66aa7bcea0e63364828b92ccfdf5c90020797a4386" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.039448 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5nqn" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.041671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd37ff46-765e-4543-ba21-e5bc7df64457","Type":"ContainerStarted","Data":"726ab1a040ac2ecf80760fea79ba93d9b7ddcf5b258a600f7ef33c4a854cd393"} Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.056591 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rcvq7" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="registry-server" containerID="cri-o://4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9" gracePeriod=2 Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.056951 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36178046-e97b-4886-b9e8-94df2cccdfbd","Type":"ContainerStarted","Data":"b87c4eb433322bf13571ce39f3024497eff3d35fde06bff75771c2f2bc51793e"} Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.096124 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dhv8b" podStartSLOduration=3.096079509 podStartE2EDuration="3.096079509s" podCreationTimestamp="2026-01-28 19:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:00:59.057400668 +0000 UTC m=+1527.068927443" watchObservedRunningTime="2026-01-28 19:00:59.096079509 +0000 UTC m=+1527.107606284" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.113758 4749 scope.go:117] "RemoveContainer" containerID="eac3f4818b2555bbe2fb0ee7192790f4d7c1d60f9371f257a70952a42840ce08" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.177529 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.206641 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-mf4jk"] Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.220018 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5nqn"] Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.257942 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b5nqn"] Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.417833 4749 scope.go:117] "RemoveContainer" containerID="3b703fc79c1ddb2b535b71176efde7fc0fc40d0f4e617872bc1dbb8fc50d7c4c" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.458706 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwj5s"] Jan 28 19:00:59 crc kubenswrapper[4749]: E0128 19:00:59.460514 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="extract-utilities" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.460539 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="extract-utilities" Jan 28 19:00:59 crc kubenswrapper[4749]: E0128 19:00:59.460576 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="registry-server" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.460590 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="registry-server" Jan 28 19:00:59 crc kubenswrapper[4749]: E0128 19:00:59.460625 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="extract-content" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.460640 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="extract-content" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.461313 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" containerName="registry-server" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.469192 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.475782 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.475913 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.511151 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwj5s"] Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.519876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-scripts\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.520005 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.520099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls75n\" (UniqueName: \"kubernetes.io/projected/acf9d598-9f13-42c7-9f82-9d6138a0d553-kube-api-access-ls75n\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.521908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-config-data\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.632519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-config-data\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.633756 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-scripts\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.638196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.638301 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-config-data\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.641636 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls75n\" (UniqueName: \"kubernetes.io/projected/acf9d598-9f13-42c7-9f82-9d6138a0d553-kube-api-access-ls75n\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.650734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-scripts\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.651509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.695733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls75n\" (UniqueName: \"kubernetes.io/projected/acf9d598-9f13-42c7-9f82-9d6138a0d553-kube-api-access-ls75n\") pod \"nova-cell1-conductor-db-sync-bwj5s\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.842698 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.930832 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.948114 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-utilities\") pod \"f24b8786-67ac-48d4-ac79-106722f66977\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.948233 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-catalog-content\") pod \"f24b8786-67ac-48d4-ac79-106722f66977\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.948391 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrvd6\" (UniqueName: \"kubernetes.io/projected/f24b8786-67ac-48d4-ac79-106722f66977-kube-api-access-jrvd6\") pod \"f24b8786-67ac-48d4-ac79-106722f66977\" (UID: \"f24b8786-67ac-48d4-ac79-106722f66977\") " Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.951447 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-utilities" (OuterVolumeSpecName: "utilities") pod "f24b8786-67ac-48d4-ac79-106722f66977" (UID: "f24b8786-67ac-48d4-ac79-106722f66977"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:00:59 crc kubenswrapper[4749]: I0128 19:00:59.961742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24b8786-67ac-48d4-ac79-106722f66977-kube-api-access-jrvd6" (OuterVolumeSpecName: "kube-api-access-jrvd6") pod "f24b8786-67ac-48d4-ac79-106722f66977" (UID: "f24b8786-67ac-48d4-ac79-106722f66977"). InnerVolumeSpecName "kube-api-access-jrvd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.023625 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f24b8786-67ac-48d4-ac79-106722f66977" (UID: "f24b8786-67ac-48d4-ac79-106722f66977"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.054898 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrvd6\" (UniqueName: \"kubernetes.io/projected/f24b8786-67ac-48d4-ac79-106722f66977-kube-api-access-jrvd6\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.054935 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.054944 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f24b8786-67ac-48d4-ac79-106722f66977-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.094601 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" containerID="33f39a50bdc71171f16ed7433a471255c2d2ca50978e9d46933bc80fdc56239c" exitCode=0 Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.094669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" event={"ID":"9ec54373-337d-4d3f-a3ae-1b5be5f892b4","Type":"ContainerDied","Data":"33f39a50bdc71171f16ed7433a471255c2d2ca50978e9d46933bc80fdc56239c"} Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.094697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" event={"ID":"9ec54373-337d-4d3f-a3ae-1b5be5f892b4","Type":"ContainerStarted","Data":"079985dc57a5ac338c003cf1ec6d21f818e7ea54bad88983a7ff898123c6a98b"} Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.110270 4749 generic.go:334] "Generic (PLEG): container finished" podID="f24b8786-67ac-48d4-ac79-106722f66977" containerID="4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9" exitCode=0 Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.110387 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcvq7" event={"ID":"f24b8786-67ac-48d4-ac79-106722f66977","Type":"ContainerDied","Data":"4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9"} Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.110421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcvq7" event={"ID":"f24b8786-67ac-48d4-ac79-106722f66977","Type":"ContainerDied","Data":"c1f2af4a10f91ba0da47a0ab463f9b92610df5ab8b57a8fdc0c2dbfdb5f55488"} Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.110439 4749 scope.go:117] "RemoveContainer" containerID="4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.110576 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcvq7" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.151402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db2b8c43-d40b-4cc6-9b84-d9d43660af11","Type":"ContainerStarted","Data":"5ecf908db1b83cec273a8e0519fd61604f9f3723e0f10f5f35cb8f3bbb7ab2a4"} Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.244494 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29493781-5zxzd"] Jan 28 19:01:00 crc kubenswrapper[4749]: E0128 19:01:00.245308 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="registry-server" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.245346 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="registry-server" Jan 28 19:01:00 crc kubenswrapper[4749]: E0128 19:01:00.245437 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="extract-content" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.245444 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="extract-content" Jan 28 19:01:00 crc kubenswrapper[4749]: E0128 19:01:00.245463 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="extract-utilities" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.245469 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="extract-utilities" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.245919 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24b8786-67ac-48d4-ac79-106722f66977" containerName="registry-server" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.251842 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.331045 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29493781-5zxzd"] Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.377943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-config-data\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.378013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-fernet-keys\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.378097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-combined-ca-bundle\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.378141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h8cp\" (UniqueName: \"kubernetes.io/projected/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-kube-api-access-7h8cp\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.392143 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcvq7"] Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.420927 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rcvq7"] Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.476825 4749 scope.go:117] "RemoveContainer" containerID="b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.484832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-config-data\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.484899 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-fernet-keys\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.484967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-combined-ca-bundle\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 
19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.484997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h8cp\" (UniqueName: \"kubernetes.io/projected/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-kube-api-access-7h8cp\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.491474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-fernet-keys\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.491473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-config-data\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.494089 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-combined-ca-bundle\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.503925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h8cp\" (UniqueName: \"kubernetes.io/projected/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-kube-api-access-7h8cp\") pod \"keystone-cron-29493781-5zxzd\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.642361 4749 scope.go:117] "RemoveContainer" containerID="06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.685919 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.712595 4749 scope.go:117] "RemoveContainer" containerID="4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9" Jan 28 19:01:00 crc kubenswrapper[4749]: E0128 19:01:00.713054 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9\": container with ID starting with 4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9 not found: ID does not exist" containerID="4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.713090 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9"} err="failed to get container status \"4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9\": rpc error: code = NotFound desc = could not find container \"4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9\": container with ID starting with 4c6dab785f7b60f3adfcd57fa410afadc94af79dcee468e297a1c78541599fe9 not found: ID does not exist" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.713111 4749 scope.go:117] "RemoveContainer" containerID="b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29" Jan 28 19:01:00 crc kubenswrapper[4749]: E0128 19:01:00.717003 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29\": container with ID starting with b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29 not found: ID does not exist" containerID="b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.717075 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29"} err="failed to get container status \"b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29\": rpc error: code = NotFound desc = could not find container \"b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29\": container with ID starting with b23a890abce62504089f09f89393d6e4a3510d8425854224c01d8ab4377ebf29 not found: ID does not exist" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.717107 4749 scope.go:117] "RemoveContainer" containerID="06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0" Jan 28 19:01:00 crc kubenswrapper[4749]: E0128 19:01:00.718509 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0\": container with ID starting with 06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0 not found: ID does not exist" containerID="06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.718549 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0"} err="failed to get container status \"06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0\": rpc error: code = NotFound 
desc = could not find container \"06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0\": container with ID starting with 06dc98424ede0b0090b13b57928633305692b95f34c578d4063733001ca993b0 not found: ID does not exist" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.720543 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwj5s"] Jan 28 19:01:00 crc kubenswrapper[4749]: W0128 19:01:00.771450 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacf9d598_9f13_42c7_9f82_9d6138a0d553.slice/crio-5150d14ee08eedb8a32b732cc5fbae2ecc1205e156c3dcd0c2f4d6de3d5e7fb2 WatchSource:0}: Error finding container 5150d14ee08eedb8a32b732cc5fbae2ecc1205e156c3dcd0c2f4d6de3d5e7fb2: Status 404 returned error can't find the container with id 5150d14ee08eedb8a32b732cc5fbae2ecc1205e156c3dcd0c2f4d6de3d5e7fb2 Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.901002 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa7fd2d-ff29-4928-9e51-54ce06ec0028" path="/var/lib/kubelet/pods/7aa7fd2d-ff29-4928-9e51-54ce06ec0028/volumes" Jan 28 19:01:00 crc kubenswrapper[4749]: I0128 19:01:00.904410 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24b8786-67ac-48d4-ac79-106722f66977" path="/var/lib/kubelet/pods/f24b8786-67ac-48d4-ac79-106722f66977/volumes" Jan 28 19:01:01 crc kubenswrapper[4749]: I0128 19:01:01.188332 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:01:01 crc kubenswrapper[4749]: I0128 19:01:01.197917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" event={"ID":"9ec54373-337d-4d3f-a3ae-1b5be5f892b4","Type":"ContainerStarted","Data":"a9398e22e233fff5ed277aec93ec36e8fd610c432a1ce0b6b86871b0aca3f3d1"} Jan 28 19:01:01 crc kubenswrapper[4749]: I0128 19:01:01.199093 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:01:01 crc kubenswrapper[4749]: I0128 19:01:01.222749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" event={"ID":"acf9d598-9f13-42c7-9f82-9d6138a0d553","Type":"ContainerStarted","Data":"5150d14ee08eedb8a32b732cc5fbae2ecc1205e156c3dcd0c2f4d6de3d5e7fb2"} Jan 28 19:01:01 crc kubenswrapper[4749]: I0128 19:01:01.233188 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:01 crc kubenswrapper[4749]: I0128 19:01:01.238515 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" podStartSLOduration=4.23848239 podStartE2EDuration="4.23848239s" podCreationTimestamp="2026-01-28 19:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:01.223891057 +0000 UTC m=+1529.235417832" watchObservedRunningTime="2026-01-28 19:01:01.23848239 +0000 UTC m=+1529.250009165" Jan 28 19:01:01 crc kubenswrapper[4749]: I0128 19:01:01.590949 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29493781-5zxzd"] Jan 28 19:01:02 crc kubenswrapper[4749]: I0128 19:01:02.254337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493781-5zxzd" 
event={"ID":"60b19a5a-17c1-4990-af7b-f3636f9e1cd2","Type":"ContainerStarted","Data":"dbca978ae654cff75472b444ef112b4924c223401e797f9fa5ebb4e8ea58c2a2"} Jan 28 19:01:02 crc kubenswrapper[4749]: I0128 19:01:02.269520 4749 generic.go:334] "Generic (PLEG): container finished" podID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerID="4f9157dd50c612a7ce0631afec387a6a72dc11eb5d5a558ebdef1621723cfd9c" exitCode=0 Jan 28 19:01:02 crc kubenswrapper[4749]: I0128 19:01:02.269601 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerDied","Data":"4f9157dd50c612a7ce0631afec387a6a72dc11eb5d5a558ebdef1621723cfd9c"} Jan 28 19:01:02 crc kubenswrapper[4749]: I0128 19:01:02.273071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" event={"ID":"acf9d598-9f13-42c7-9f82-9d6138a0d553","Type":"ContainerStarted","Data":"352f9833b2a26acbbdd6d3e25705e4404bdb55f7587ec441f1b36f07b86495fc"} Jan 28 19:01:02 crc kubenswrapper[4749]: I0128 19:01:02.295909 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" podStartSLOduration=3.295886873 podStartE2EDuration="3.295886873s" podCreationTimestamp="2026-01-28 19:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:02.286089549 +0000 UTC m=+1530.297616324" watchObservedRunningTime="2026-01-28 19:01:02.295886873 +0000 UTC m=+1530.307413648" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.298303 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.357720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e08c55a7-aec7-463e-8567-6afe8c166fcb","Type":"ContainerDied","Data":"a5d143bc0a4c3ed117851ba51ee2c7ba41b3f212ad751f4340453dac70019bd9"} Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.357779 4749 scope.go:117] "RemoveContainer" containerID="e1b383e112ca3d31ce6fed2ad8e8e2d0d7bc1e67c9366b3a591db1ad2aed47ee" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.357914 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.415792 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlzmg\" (UniqueName: \"kubernetes.io/projected/e08c55a7-aec7-463e-8567-6afe8c166fcb-kube-api-access-jlzmg\") pod \"e08c55a7-aec7-463e-8567-6afe8c166fcb\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.415932 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-config-data\") pod \"e08c55a7-aec7-463e-8567-6afe8c166fcb\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.416098 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-sg-core-conf-yaml\") pod \"e08c55a7-aec7-463e-8567-6afe8c166fcb\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.416184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-combined-ca-bundle\") pod \"e08c55a7-aec7-463e-8567-6afe8c166fcb\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.416279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-scripts\") pod \"e08c55a7-aec7-463e-8567-6afe8c166fcb\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.416427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-run-httpd\") pod \"e08c55a7-aec7-463e-8567-6afe8c166fcb\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.416468 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-log-httpd\") pod \"e08c55a7-aec7-463e-8567-6afe8c166fcb\" (UID: \"e08c55a7-aec7-463e-8567-6afe8c166fcb\") " Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.417796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e08c55a7-aec7-463e-8567-6afe8c166fcb" (UID: "e08c55a7-aec7-463e-8567-6afe8c166fcb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.423356 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e08c55a7-aec7-463e-8567-6afe8c166fcb" (UID: "e08c55a7-aec7-463e-8567-6afe8c166fcb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.439243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-scripts" (OuterVolumeSpecName: "scripts") pod "e08c55a7-aec7-463e-8567-6afe8c166fcb" (UID: "e08c55a7-aec7-463e-8567-6afe8c166fcb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.449591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08c55a7-aec7-463e-8567-6afe8c166fcb-kube-api-access-jlzmg" (OuterVolumeSpecName: "kube-api-access-jlzmg") pod "e08c55a7-aec7-463e-8567-6afe8c166fcb" (UID: "e08c55a7-aec7-463e-8567-6afe8c166fcb"). InnerVolumeSpecName "kube-api-access-jlzmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.485717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e08c55a7-aec7-463e-8567-6afe8c166fcb" (UID: "e08c55a7-aec7-463e-8567-6afe8c166fcb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.519625 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.519696 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlzmg\" (UniqueName: \"kubernetes.io/projected/e08c55a7-aec7-463e-8567-6afe8c166fcb-kube-api-access-jlzmg\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.519711 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.519721 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.519731 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e08c55a7-aec7-463e-8567-6afe8c166fcb-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.590270 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e08c55a7-aec7-463e-8567-6afe8c166fcb" (UID: "e08c55a7-aec7-463e-8567-6afe8c166fcb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.634348 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.657583 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-config-data" (OuterVolumeSpecName: "config-data") pod "e08c55a7-aec7-463e-8567-6afe8c166fcb" (UID: "e08c55a7-aec7-463e-8567-6afe8c166fcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.718377 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.737964 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08c55a7-aec7-463e-8567-6afe8c166fcb-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.748458 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.782672 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:04 crc kubenswrapper[4749]: E0128 19:01:04.784059 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="ceilometer-central-agent" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.784081 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="ceilometer-central-agent" Jan 28 19:01:04 crc kubenswrapper[4749]: E0128 19:01:04.784093 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="ceilometer-notification-agent" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.784100 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="ceilometer-notification-agent" Jan 28 19:01:04 crc kubenswrapper[4749]: E0128 19:01:04.784132 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="proxy-httpd" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.784138 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="proxy-httpd" Jan 28 19:01:04 crc kubenswrapper[4749]: E0128 19:01:04.784157 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="sg-core" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.784163 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="sg-core" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.784380 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="ceilometer-central-agent" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.784400 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="sg-core" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.784415 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="proxy-httpd" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.784424 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" containerName="ceilometer-notification-agent" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.786476 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.790806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.790984 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.797591 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.899027 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08c55a7-aec7-463e-8567-6afe8c166fcb" path="/var/lib/kubelet/pods/e08c55a7-aec7-463e-8567-6afe8c166fcb/volumes" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.942653 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd8qk\" (UniqueName: \"kubernetes.io/projected/451b80dc-7224-4bd0-a611-118ff9808edf-kube-api-access-zd8qk\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.942729 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-run-httpd\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.942883 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-log-httpd\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.942993 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.943057 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-scripts\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.943075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-config-data\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:04 crc kubenswrapper[4749]: I0128 19:01:04.943375 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.045423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8qk\" (UniqueName: \"kubernetes.io/projected/451b80dc-7224-4bd0-a611-118ff9808edf-kube-api-access-zd8qk\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.045503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-run-httpd\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.045551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-log-httpd\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.045593 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.045623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-scripts\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.045637 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-config-data\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.046852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.047530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-log-httpd\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.047980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-run-httpd\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.052193 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.052459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-scripts\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.054223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.056587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-config-data\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.068529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8qk\" (UniqueName: \"kubernetes.io/projected/451b80dc-7224-4bd0-a611-118ff9808edf-kube-api-access-zd8qk\") pod \"ceilometer-0\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.125518 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.260296 4749 scope.go:117] "RemoveContainer" containerID="633f9ee9574627558a2d6b946515b84496c214efeb0f643a987ba3459f9cae95" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.319527 4749 scope.go:117] "RemoveContainer" containerID="07a457c0188557adfaa96cc458ff67b545a4585df69e282bea963e19f3662609" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.522787 4749 scope.go:117] "RemoveContainer" containerID="4f9157dd50c612a7ce0631afec387a6a72dc11eb5d5a558ebdef1621723cfd9c" Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.740941 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.755043 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.755354 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" containerName="nova-cell0-conductor-conductor" containerID="cri-o://ffa38e46c426004907460b26c4c2e56146d7e9a53299a6a7c4a9382ee037a0de" gracePeriod=30 Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.782370 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:05 crc kubenswrapper[4749]: I0128 19:01:05.937931 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:05 crc kubenswrapper[4749]: W0128 19:01:05.967566 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451b80dc_7224_4bd0_a611_118ff9808edf.slice/crio-a8f5f1e947619978bbf4cf76d7053602290717200b3f63f969ec67af157ab908 WatchSource:0}: Error finding container a8f5f1e947619978bbf4cf76d7053602290717200b3f63f969ec67af157ab908: Status 404 returned error can't find the container with id a8f5f1e947619978bbf4cf76d7053602290717200b3f63f969ec67af157ab908 Jan 28 19:01:06 crc kubenswrapper[4749]: E0128 19:01:06.266693 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ffa38e46c426004907460b26c4c2e56146d7e9a53299a6a7c4a9382ee037a0de" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 28 19:01:06 crc kubenswrapper[4749]: E0128 19:01:06.277769 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ffa38e46c426004907460b26c4c2e56146d7e9a53299a6a7c4a9382ee037a0de" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 28 19:01:06 crc kubenswrapper[4749]: E0128 19:01:06.281791 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ffa38e46c426004907460b26c4c2e56146d7e9a53299a6a7c4a9382ee037a0de" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 28 19:01:06 crc kubenswrapper[4749]: E0128 19:01:06.281868 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-cell0-conductor-0" podUID="cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" containerName="nova-cell0-conductor-conductor" Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.439707 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36178046-e97b-4886-b9e8-94df2cccdfbd","Type":"ContainerStarted","Data":"56af00f306db462064bc4cbd5e26a936b1788dcce7c333c374c669cd86036078"} Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.441994 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0970657-d202-4540-a15e-66bbf7878ff8","Type":"ContainerStarted","Data":"d253f2834a2ad1ad365e6db739778848d75206c4def31c155b4915d51657b284"} Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.443734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493781-5zxzd" event={"ID":"60b19a5a-17c1-4990-af7b-f3636f9e1cd2","Type":"ContainerStarted","Data":"72e7d9f2150fe25a409fa5351b3eb4425c48ecf498b035dca8e20aa2206c40ec"} Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.449042 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cd37ff46-765e-4543-ba21-e5bc7df64457" containerName="nova-scheduler-scheduler" containerID="cri-o://f0cd6422e11d34a5546dd497c9e24100996cb616afe6a832cd8df16274a019a9" gracePeriod=30 Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.449215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd37ff46-765e-4543-ba21-e5bc7df64457","Type":"ContainerStarted","Data":"f0cd6422e11d34a5546dd497c9e24100996cb616afe6a832cd8df16274a019a9"} Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.455304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerStarted","Data":"a8f5f1e947619978bbf4cf76d7053602290717200b3f63f969ec67af157ab908"} Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.464188 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29493781-5zxzd" podStartSLOduration=6.4641710549999996 podStartE2EDuration="6.464171055s" podCreationTimestamp="2026-01-28 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:06.461216492 +0000 UTC m=+1534.472743297" watchObservedRunningTime="2026-01-28 19:01:06.464171055 +0000 UTC m=+1534.475697830" Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.467680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db2b8c43-d40b-4cc6-9b84-d9d43660af11","Type":"ContainerStarted","Data":"ef7cf3b768ab072ae74ed4e2a7429c5d8bb818e1c5fc9d696b07e3b417eb0226"} Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.467846 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="db2b8c43-d40b-4cc6-9b84-d9d43660af11" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ef7cf3b768ab072ae74ed4e2a7429c5d8bb818e1c5fc9d696b07e3b417eb0226" gracePeriod=30 Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.497964 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.859076885 podStartE2EDuration="9.497942345s" podCreationTimestamp="2026-01-28 19:00:57 +0000 
UTC" firstStartedPulling="2026-01-28 19:00:58.710865754 +0000 UTC m=+1526.722392529" lastFinishedPulling="2026-01-28 19:01:05.349731214 +0000 UTC m=+1533.361257989" observedRunningTime="2026-01-28 19:01:06.477939448 +0000 UTC m=+1534.489466233" watchObservedRunningTime="2026-01-28 19:01:06.497942345 +0000 UTC m=+1534.509469130" Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.512367 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.321538622 podStartE2EDuration="9.512345103s" podCreationTimestamp="2026-01-28 19:00:57 +0000 UTC" firstStartedPulling="2026-01-28 19:00:59.166127252 +0000 UTC m=+1527.177654027" lastFinishedPulling="2026-01-28 19:01:05.356933733 +0000 UTC m=+1533.368460508" observedRunningTime="2026-01-28 19:01:06.508808036 +0000 UTC m=+1534.520334841" watchObservedRunningTime="2026-01-28 19:01:06.512345103 +0000 UTC m=+1534.523871878" Jan 28 19:01:06 crc kubenswrapper[4749]: I0128 19:01:06.990100 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 19:01:06 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:01:06 crc kubenswrapper[4749]: > Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.481253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36178046-e97b-4886-b9e8-94df2cccdfbd","Type":"ContainerStarted","Data":"635e8023c61207bd40d178bedb7341dd7cc5c363f4d9571e47b5ce4273215efc"} Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.481962 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerName="nova-metadata-metadata" containerID="cri-o://635e8023c61207bd40d178bedb7341dd7cc5c363f4d9571e47b5ce4273215efc" gracePeriod=30 Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.481919 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerName="nova-metadata-log" containerID="cri-o://56af00f306db462064bc4cbd5e26a936b1788dcce7c333c374c669cd86036078" gracePeriod=30 Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.488920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0970657-d202-4540-a15e-66bbf7878ff8","Type":"ContainerStarted","Data":"74b48a176aeac369fde85659a406ab80141425bd46303539a0a6a2582128f77c"} Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.489109 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" containerName="nova-api-log" containerID="cri-o://d253f2834a2ad1ad365e6db739778848d75206c4def31c155b4915d51657b284" gracePeriod=30 Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.489215 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" containerName="nova-api-api" containerID="cri-o://74b48a176aeac369fde85659a406ab80141425bd46303539a0a6a2582128f77c" gracePeriod=30 Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.496448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerStarted","Data":"4f17aa195430061da8b0d8b496f51e05ef3911aea5a8bb99bf4399e8da6048ec"} Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.496497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerStarted","Data":"b3695c28d70a4698dc7cc2af278f555f840365b853c685e11dfd9672c7154747"} Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.519435 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.527101 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.12914015 podStartE2EDuration="10.527077764s" podCreationTimestamp="2026-01-28 19:00:57 +0000 UTC" firstStartedPulling="2026-01-28 19:00:58.959655675 +0000 UTC m=+1526.971182450" lastFinishedPulling="2026-01-28 19:01:05.357593289 +0000 UTC m=+1533.369120064" observedRunningTime="2026-01-28 19:01:07.504455331 +0000 UTC m=+1535.515982116" watchObservedRunningTime="2026-01-28 19:01:07.527077764 +0000 UTC m=+1535.538604539" Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.547749 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.884713008 podStartE2EDuration="10.547728088s" podCreationTimestamp="2026-01-28 19:00:57 +0000 UTC" firstStartedPulling="2026-01-28 19:00:58.696870346 +0000 UTC m=+1526.708397121" lastFinishedPulling="2026-01-28 19:01:05.359885426 +0000 UTC m=+1533.371412201" observedRunningTime="2026-01-28 19:01:07.541645376 +0000 UTC m=+1535.553172161" watchObservedRunningTime="2026-01-28 19:01:07.547728088 +0000 UTC m=+1535.559254873" Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.952967 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.992085 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 19:01:07 crc kubenswrapper[4749]: I0128 19:01:07.992161 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.159927 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.252639 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.341880 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-6xh7w"] Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.342152 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" podUID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerName="dnsmasq-dns" containerID="cri-o://2812fb5cb1f4918f47fd7f8b8d0f8a7818946b031b3fed1d58f86293259f36c6" gracePeriod=10 Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.416916 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" podUID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.207:5353: connect: connection refused" Jan 28 19:01:08 crc 
kubenswrapper[4749]: I0128 19:01:08.577569 4749 generic.go:334] "Generic (PLEG): container finished" podID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerID="635e8023c61207bd40d178bedb7341dd7cc5c363f4d9571e47b5ce4273215efc" exitCode=0 Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.578013 4749 generic.go:334] "Generic (PLEG): container finished" podID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerID="56af00f306db462064bc4cbd5e26a936b1788dcce7c333c374c669cd86036078" exitCode=143 Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.578088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36178046-e97b-4886-b9e8-94df2cccdfbd","Type":"ContainerDied","Data":"635e8023c61207bd40d178bedb7341dd7cc5c363f4d9571e47b5ce4273215efc"} Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.578123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36178046-e97b-4886-b9e8-94df2cccdfbd","Type":"ContainerDied","Data":"56af00f306db462064bc4cbd5e26a936b1788dcce7c333c374c669cd86036078"} Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.619758 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0970657-d202-4540-a15e-66bbf7878ff8" containerID="74b48a176aeac369fde85659a406ab80141425bd46303539a0a6a2582128f77c" exitCode=0 Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.619802 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0970657-d202-4540-a15e-66bbf7878ff8" containerID="d253f2834a2ad1ad365e6db739778848d75206c4def31c155b4915d51657b284" exitCode=143 Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.619889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0970657-d202-4540-a15e-66bbf7878ff8","Type":"ContainerDied","Data":"74b48a176aeac369fde85659a406ab80141425bd46303539a0a6a2582128f77c"} Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.619926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0970657-d202-4540-a15e-66bbf7878ff8","Type":"ContainerDied","Data":"d253f2834a2ad1ad365e6db739778848d75206c4def31c155b4915d51657b284"} Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.645611 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerID="2812fb5cb1f4918f47fd7f8b8d0f8a7818946b031b3fed1d58f86293259f36c6" exitCode=0 Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.645739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" event={"ID":"f9ffa10e-04c0-4b2f-91ce-6614d405034a","Type":"ContainerDied","Data":"2812fb5cb1f4918f47fd7f8b8d0f8a7818946b031b3fed1d58f86293259f36c6"} Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.670413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerStarted","Data":"ad2c5fd7a27007c73dbaf1cbff333a1334947b6ae4e234ae094600ecb3f759b7"} Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.858831 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.878593 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.890223 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdmrn\" (UniqueName: \"kubernetes.io/projected/e0970657-d202-4540-a15e-66bbf7878ff8-kube-api-access-vdmrn\") pod \"e0970657-d202-4540-a15e-66bbf7878ff8\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.890301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-config-data\") pod \"36178046-e97b-4886-b9e8-94df2cccdfbd\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.890371 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2knj7\" (UniqueName: \"kubernetes.io/projected/36178046-e97b-4886-b9e8-94df2cccdfbd-kube-api-access-2knj7\") pod \"36178046-e97b-4886-b9e8-94df2cccdfbd\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.890390 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0970657-d202-4540-a15e-66bbf7878ff8-logs\") pod \"e0970657-d202-4540-a15e-66bbf7878ff8\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.890419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-combined-ca-bundle\") pod \"36178046-e97b-4886-b9e8-94df2cccdfbd\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.891639 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0970657-d202-4540-a15e-66bbf7878ff8-logs" (OuterVolumeSpecName: "logs") pod "e0970657-d202-4540-a15e-66bbf7878ff8" (UID: "e0970657-d202-4540-a15e-66bbf7878ff8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.899122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0970657-d202-4540-a15e-66bbf7878ff8-kube-api-access-vdmrn" (OuterVolumeSpecName: "kube-api-access-vdmrn") pod "e0970657-d202-4540-a15e-66bbf7878ff8" (UID: "e0970657-d202-4540-a15e-66bbf7878ff8"). InnerVolumeSpecName "kube-api-access-vdmrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.901455 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36178046-e97b-4886-b9e8-94df2cccdfbd-kube-api-access-2knj7" (OuterVolumeSpecName: "kube-api-access-2knj7") pod "36178046-e97b-4886-b9e8-94df2cccdfbd" (UID: "36178046-e97b-4886-b9e8-94df2cccdfbd"). InnerVolumeSpecName "kube-api-access-2knj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.937256 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-config-data" (OuterVolumeSpecName: "config-data") pod "36178046-e97b-4886-b9e8-94df2cccdfbd" (UID: "36178046-e97b-4886-b9e8-94df2cccdfbd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.942107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36178046-e97b-4886-b9e8-94df2cccdfbd" (UID: "36178046-e97b-4886-b9e8-94df2cccdfbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.994479 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-combined-ca-bundle\") pod \"e0970657-d202-4540-a15e-66bbf7878ff8\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.994574 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-config-data\") pod \"e0970657-d202-4540-a15e-66bbf7878ff8\" (UID: \"e0970657-d202-4540-a15e-66bbf7878ff8\") " Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.994603 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36178046-e97b-4886-b9e8-94df2cccdfbd-logs\") pod \"36178046-e97b-4886-b9e8-94df2cccdfbd\" (UID: \"36178046-e97b-4886-b9e8-94df2cccdfbd\") " Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.995476 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdmrn\" (UniqueName: \"kubernetes.io/projected/e0970657-d202-4540-a15e-66bbf7878ff8-kube-api-access-vdmrn\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.995504 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.995517 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2knj7\" (UniqueName: \"kubernetes.io/projected/36178046-e97b-4886-b9e8-94df2cccdfbd-kube-api-access-2knj7\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.995532 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0970657-d202-4540-a15e-66bbf7878ff8-logs\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.995543 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36178046-e97b-4886-b9e8-94df2cccdfbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:08 crc kubenswrapper[4749]: I0128 19:01:08.995695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36178046-e97b-4886-b9e8-94df2cccdfbd-logs" (OuterVolumeSpecName: "logs") pod "36178046-e97b-4886-b9e8-94df2cccdfbd" (UID: "36178046-e97b-4886-b9e8-94df2cccdfbd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.040581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0970657-d202-4540-a15e-66bbf7878ff8" (UID: "e0970657-d202-4540-a15e-66bbf7878ff8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.072014 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-config-data" (OuterVolumeSpecName: "config-data") pod "e0970657-d202-4540-a15e-66bbf7878ff8" (UID: "e0970657-d202-4540-a15e-66bbf7878ff8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.098434 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.098472 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0970657-d202-4540-a15e-66bbf7878ff8-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.098485 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36178046-e97b-4886-b9e8-94df2cccdfbd-logs\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.328378 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.507469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-svc\") pod \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.507576 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-swift-storage-0\") pod \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.507614 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-nb\") pod \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.507753 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cl5r\" (UniqueName: \"kubernetes.io/projected/f9ffa10e-04c0-4b2f-91ce-6614d405034a-kube-api-access-4cl5r\") pod \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.507909 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-sb\") pod \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.507925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-config\") pod \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\" (UID: \"f9ffa10e-04c0-4b2f-91ce-6614d405034a\") " Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.520706 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ffa10e-04c0-4b2f-91ce-6614d405034a-kube-api-access-4cl5r" (OuterVolumeSpecName: "kube-api-access-4cl5r") pod "f9ffa10e-04c0-4b2f-91ce-6614d405034a" (UID: "f9ffa10e-04c0-4b2f-91ce-6614d405034a"). InnerVolumeSpecName "kube-api-access-4cl5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.611706 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cl5r\" (UniqueName: \"kubernetes.io/projected/f9ffa10e-04c0-4b2f-91ce-6614d405034a-kube-api-access-4cl5r\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.621206 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9ffa10e-04c0-4b2f-91ce-6614d405034a" (UID: "f9ffa10e-04c0-4b2f-91ce-6614d405034a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.642522 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9ffa10e-04c0-4b2f-91ce-6614d405034a" (UID: "f9ffa10e-04c0-4b2f-91ce-6614d405034a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.696799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-config" (OuterVolumeSpecName: "config") pod "f9ffa10e-04c0-4b2f-91ce-6614d405034a" (UID: "f9ffa10e-04c0-4b2f-91ce-6614d405034a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.710851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9ffa10e-04c0-4b2f-91ce-6614d405034a" (UID: "f9ffa10e-04c0-4b2f-91ce-6614d405034a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.713345 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.713381 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-config\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.713391 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.713400 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.717761 4749 generic.go:334] "Generic (PLEG): container finished" podID="cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" containerID="ffa38e46c426004907460b26c4c2e56146d7e9a53299a6a7c4a9382ee037a0de" exitCode=0 Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.717987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec","Type":"ContainerDied","Data":"ffa38e46c426004907460b26c4c2e56146d7e9a53299a6a7c4a9382ee037a0de"} Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.750502 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.752228 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36178046-e97b-4886-b9e8-94df2cccdfbd","Type":"ContainerDied","Data":"b87c4eb433322bf13571ce39f3024497eff3d35fde06bff75771c2f2bc51793e"} Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.752293 4749 scope.go:117] "RemoveContainer" containerID="635e8023c61207bd40d178bedb7341dd7cc5c363f4d9571e47b5ce4273215efc" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.761908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9ffa10e-04c0-4b2f-91ce-6614d405034a" (UID: "f9ffa10e-04c0-4b2f-91ce-6614d405034a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.772446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0970657-d202-4540-a15e-66bbf7878ff8","Type":"ContainerDied","Data":"f96ae341e7f003e84137a0119e791ba8afd4396c4d41fbdf958656854c1a22a3"} Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.772546 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.799040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" event={"ID":"f9ffa10e-04c0-4b2f-91ce-6614d405034a","Type":"ContainerDied","Data":"d9f199129eaadcbea393fb11f82c51995366a8f5479cfe1f3f0d0be8f07ad82e"} Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.799156 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-6xh7w" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.833641 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.834285 4749 scope.go:117] "RemoveContainer" containerID="56af00f306db462064bc4cbd5e26a936b1788dcce7c333c374c669cd86036078" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.876142 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ffa10e-04c0-4b2f-91ce-6614d405034a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.882454 4749 scope.go:117] "RemoveContainer" containerID="74b48a176aeac369fde85659a406ab80141425bd46303539a0a6a2582128f77c" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.957822 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.975515 4749 scope.go:117] "RemoveContainer" containerID="d253f2834a2ad1ad365e6db739778848d75206c4def31c155b4915d51657b284" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.981041 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:09 crc kubenswrapper[4749]: E0128 19:01:09.981589 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerName="dnsmasq-dns" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.981627 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerName="dnsmasq-dns" Jan 28 19:01:09 crc kubenswrapper[4749]: E0128 19:01:09.981669 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" containerName="nova-api-api" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.981696 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" containerName="nova-api-api" Jan 28 19:01:09 crc kubenswrapper[4749]: E0128 19:01:09.981712 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" containerName="nova-api-log" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.981718 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" containerName="nova-api-log" Jan 28 19:01:09 crc kubenswrapper[4749]: E0128 19:01:09.981725 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerName="nova-metadata-metadata" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.981731 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerName="nova-metadata-metadata" Jan 28 19:01:09 crc kubenswrapper[4749]: E0128 19:01:09.981743 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerName="init" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.981769 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerName="init" Jan 28 19:01:09 crc kubenswrapper[4749]: E0128 19:01:09.981793 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerName="nova-metadata-log" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 
19:01:09.981799 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerName="nova-metadata-log" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.982067 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerName="nova-metadata-log" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.982108 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" containerName="nova-api-api" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.982118 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" containerName="nova-api-log" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.982137 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" containerName="dnsmasq-dns" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.982168 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" containerName="nova-metadata-metadata" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.983911 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.988023 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 28 19:01:09 crc kubenswrapper[4749]: I0128 19:01:09.988380 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.016708 4749 scope.go:117] "RemoveContainer" containerID="2812fb5cb1f4918f47fd7f8b8d0f8a7818946b031b3fed1d58f86293259f36c6" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.022033 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.049429 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-6xh7w"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.070126 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-6xh7w"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.080552 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdtt\" (UniqueName: \"kubernetes.io/projected/f7abe3a3-c49a-46fe-9f47-2099f07a3418-kube-api-access-lkdtt\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.080614 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.080722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7abe3a3-c49a-46fe-9f47-2099f07a3418-logs\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.080787 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.080824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-config-data\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.081922 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.093262 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.105153 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.107684 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.110038 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.136015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.136908 4749 scope.go:117] "RemoveContainer" containerID="bc8606ef844ee2362c996468fd9e09f9cff780b5d59867e6360add18cd1a587a" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.185643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdtt\" (UniqueName: \"kubernetes.io/projected/f7abe3a3-c49a-46fe-9f47-2099f07a3418-kube-api-access-lkdtt\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.185701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.185755 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.185820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-config-data\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.185842 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7abe3a3-c49a-46fe-9f47-2099f07a3418-logs\") pod \"nova-metadata-0\" (UID: 
\"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.185888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-logs\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.185905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.185932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-config-data\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.186006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m2zt\" (UniqueName: \"kubernetes.io/projected/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-kube-api-access-8m2zt\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.187442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7abe3a3-c49a-46fe-9f47-2099f07a3418-logs\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.193767 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.196535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-config-data\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.200364 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.209998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdtt\" (UniqueName: \"kubernetes.io/projected/f7abe3a3-c49a-46fe-9f47-2099f07a3418-kube-api-access-lkdtt\") pod \"nova-metadata-0\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.288365 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m2zt\" (UniqueName: 
\"kubernetes.io/projected/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-kube-api-access-8m2zt\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.288821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.289050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-config-data\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.289257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-logs\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.289871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-logs\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.293239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.293314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-config-data\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.305389 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.319685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m2zt\" (UniqueName: \"kubernetes.io/projected/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-kube-api-access-8m2zt\") pod \"nova-api-0\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.425753 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.433694 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.494848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blf4z\" (UniqueName: \"kubernetes.io/projected/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-kube-api-access-blf4z\") pod \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.494918 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-config-data\") pod \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.495054 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-combined-ca-bundle\") pod \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\" (UID: \"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec\") " Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.508291 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-kube-api-access-blf4z" (OuterVolumeSpecName: "kube-api-access-blf4z") pod "cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" (UID: "cffbcc50-b658-4bc3-afcb-b037f0a4b6ec"). InnerVolumeSpecName "kube-api-access-blf4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.547735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" (UID: "cffbcc50-b658-4bc3-afcb-b037f0a4b6ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.559362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-config-data" (OuterVolumeSpecName: "config-data") pod "cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" (UID: "cffbcc50-b658-4bc3-afcb-b037f0a4b6ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.598969 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.599006 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.599018 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blf4z\" (UniqueName: \"kubernetes.io/projected/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec-kube-api-access-blf4z\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.825769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cffbcc50-b658-4bc3-afcb-b037f0a4b6ec","Type":"ContainerDied","Data":"6db9b9bd7566ce99177d22bb4c98ee5fc9228f8e69d622df66d724371f6ea1d2"} Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.825827 4749 scope.go:117] "RemoveContainer" containerID="ffa38e46c426004907460b26c4c2e56146d7e9a53299a6a7c4a9382ee037a0de" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.826059 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.920604 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36178046-e97b-4886-b9e8-94df2cccdfbd" path="/var/lib/kubelet/pods/36178046-e97b-4886-b9e8-94df2cccdfbd/volumes" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.926733 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0970657-d202-4540-a15e-66bbf7878ff8" path="/var/lib/kubelet/pods/e0970657-d202-4540-a15e-66bbf7878ff8/volumes" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.927825 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ffa10e-04c0-4b2f-91ce-6614d405034a" path="/var/lib/kubelet/pods/f9ffa10e-04c0-4b2f-91ce-6614d405034a/volumes" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.928615 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.928666 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.960376 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:01:10 crc kubenswrapper[4749]: E0128 19:01:10.961009 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" containerName="nova-cell0-conductor-conductor" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.961037 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" containerName="nova-cell0-conductor-conductor" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.961313 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" containerName="nova-cell0-conductor-conductor" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.962272 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.964524 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 19:01:10 crc kubenswrapper[4749]: I0128 19:01:10.991144 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.002762 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.107043 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.119764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77653c2e-ff64-433d-aa7a-64d8dcab8eca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.119847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77653c2e-ff64-433d-aa7a-64d8dcab8eca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.123349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-225nd\" (UniqueName: \"kubernetes.io/projected/77653c2e-ff64-433d-aa7a-64d8dcab8eca-kube-api-access-225nd\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.225504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77653c2e-ff64-433d-aa7a-64d8dcab8eca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.225661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-225nd\" (UniqueName: \"kubernetes.io/projected/77653c2e-ff64-433d-aa7a-64d8dcab8eca-kube-api-access-225nd\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.225837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77653c2e-ff64-433d-aa7a-64d8dcab8eca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.232609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77653c2e-ff64-433d-aa7a-64d8dcab8eca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.233217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/77653c2e-ff64-433d-aa7a-64d8dcab8eca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.248635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-225nd\" (UniqueName: \"kubernetes.io/projected/77653c2e-ff64-433d-aa7a-64d8dcab8eca-kube-api-access-225nd\") pod \"nova-cell0-conductor-0\" (UID: \"77653c2e-ff64-433d-aa7a-64d8dcab8eca\") " pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.287391 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.899816 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 19:01:11 crc kubenswrapper[4749]: W0128 19:01:11.932699 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77653c2e_ff64_433d_aa7a_64d8dcab8eca.slice/crio-d8737df4dfc1d9a2874fa16784dfbc4e449cc7c46565ae3674ba5e680e8cd28e WatchSource:0}: Error finding container d8737df4dfc1d9a2874fa16784dfbc4e449cc7c46565ae3674ba5e680e8cd28e: Status 404 returned error can't find the container with id d8737df4dfc1d9a2874fa16784dfbc4e449cc7c46565ae3674ba5e680e8cd28e Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.940765 4749 generic.go:334] "Generic (PLEG): container finished" podID="60b19a5a-17c1-4990-af7b-f3636f9e1cd2" containerID="72e7d9f2150fe25a409fa5351b3eb4425c48ecf498b035dca8e20aa2206c40ec" exitCode=0 Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.940824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493781-5zxzd" event={"ID":"60b19a5a-17c1-4990-af7b-f3636f9e1cd2","Type":"ContainerDied","Data":"72e7d9f2150fe25a409fa5351b3eb4425c48ecf498b035dca8e20aa2206c40ec"} Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.955267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerStarted","Data":"f3bfa85f0aa5c7e97173869e6b40a4b14a92831e4e164d3aeff4f744968a4006"} Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.955470 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.955492 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="ceilometer-central-agent" containerID="cri-o://b3695c28d70a4698dc7cc2af278f555f840365b853c685e11dfd9672c7154747" gracePeriod=30 Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.955560 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="sg-core" containerID="cri-o://ad2c5fd7a27007c73dbaf1cbff333a1334947b6ae4e234ae094600ecb3f759b7" gracePeriod=30 Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.955601 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="ceilometer-notification-agent" containerID="cri-o://4f17aa195430061da8b0d8b496f51e05ef3911aea5a8bb99bf4399e8da6048ec" 
gracePeriod=30 Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.955569 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="proxy-httpd" containerID="cri-o://f3bfa85f0aa5c7e97173869e6b40a4b14a92831e4e164d3aeff4f744968a4006" gracePeriod=30 Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.958929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c60b12f-ab80-4d1e-957f-15d57d5cd36c","Type":"ContainerStarted","Data":"c9e30660453dfce533b916786481a5960055edee0c9971434aa5ccd60d84c0d9"} Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.958978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c60b12f-ab80-4d1e-957f-15d57d5cd36c","Type":"ContainerStarted","Data":"2f13536a529566499b821275d2c40c6ed50592debe426cf6687576a90fa82bd3"} Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.976250 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7abe3a3-c49a-46fe-9f47-2099f07a3418","Type":"ContainerStarted","Data":"d7e7a8c5ca28a2eb850d3ff75ac33caee033f618897a27307d2c224fbb9011db"} Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.976304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7abe3a3-c49a-46fe-9f47-2099f07a3418","Type":"ContainerStarted","Data":"515640c2b0f3d7f6e97e1a924d99d8a9b139e44a8feff59afc319252555c7277"} Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.981544 4749 generic.go:334] "Generic (PLEG): container finished" podID="e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" containerID="b5cfbe0c5a127402e7ad2df9ec5c73613547fe684456155a29b15843ca5143eb" exitCode=0 Jan 28 19:01:11 crc kubenswrapper[4749]: I0128 19:01:11.981591 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dhv8b" event={"ID":"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe","Type":"ContainerDied","Data":"b5cfbe0c5a127402e7ad2df9ec5c73613547fe684456155a29b15843ca5143eb"} Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.021086 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.598308977 podStartE2EDuration="8.021047229s" podCreationTimestamp="2026-01-28 19:01:04 +0000 UTC" firstStartedPulling="2026-01-28 19:01:05.969851095 +0000 UTC m=+1533.981377870" lastFinishedPulling="2026-01-28 19:01:10.392589347 +0000 UTC m=+1538.404116122" observedRunningTime="2026-01-28 19:01:11.997528925 +0000 UTC m=+1540.009055720" watchObservedRunningTime="2026-01-28 19:01:12.021047229 +0000 UTC m=+1540.032574014" Jan 28 19:01:12 crc kubenswrapper[4749]: E0128 19:01:12.104149 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451b80dc_7224_4bd0_a611_118ff9808edf.slice/crio-ad2c5fd7a27007c73dbaf1cbff333a1334947b6ae4e234ae094600ecb3f759b7.scope\": RecentStats: unable to find data in memory cache]" Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.885237 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffbcc50-b658-4bc3-afcb-b037f0a4b6ec" path="/var/lib/kubelet/pods/cffbcc50-b658-4bc3-afcb-b037f0a4b6ec/volumes" Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.993795 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"77653c2e-ff64-433d-aa7a-64d8dcab8eca","Type":"ContainerStarted","Data":"fe80df83443996f93748708c9744d111565b7b44f4c47ad70fc6fb0378779eb9"} Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.993848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"77653c2e-ff64-433d-aa7a-64d8dcab8eca","Type":"ContainerStarted","Data":"d8737df4dfc1d9a2874fa16784dfbc4e449cc7c46565ae3674ba5e680e8cd28e"} Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.993944 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.997365 4749 generic.go:334] "Generic (PLEG): container finished" podID="451b80dc-7224-4bd0-a611-118ff9808edf" containerID="f3bfa85f0aa5c7e97173869e6b40a4b14a92831e4e164d3aeff4f744968a4006" exitCode=0 Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.997398 4749 generic.go:334] "Generic (PLEG): container finished" podID="451b80dc-7224-4bd0-a611-118ff9808edf" containerID="ad2c5fd7a27007c73dbaf1cbff333a1334947b6ae4e234ae094600ecb3f759b7" exitCode=2 Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.997410 4749 generic.go:334] "Generic (PLEG): container finished" podID="451b80dc-7224-4bd0-a611-118ff9808edf" containerID="4f17aa195430061da8b0d8b496f51e05ef3911aea5a8bb99bf4399e8da6048ec" exitCode=0 Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.997462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerDied","Data":"f3bfa85f0aa5c7e97173869e6b40a4b14a92831e4e164d3aeff4f744968a4006"} Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.997484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerDied","Data":"ad2c5fd7a27007c73dbaf1cbff333a1334947b6ae4e234ae094600ecb3f759b7"} Jan 28 19:01:12 crc kubenswrapper[4749]: I0128 19:01:12.997498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerDied","Data":"4f17aa195430061da8b0d8b496f51e05ef3911aea5a8bb99bf4399e8da6048ec"} Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:12.999323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c60b12f-ab80-4d1e-957f-15d57d5cd36c","Type":"ContainerStarted","Data":"de308d673e6252583bd077d5d179053b8ec39baa0df88576dd2454a52bebc395"} Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.007836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7abe3a3-c49a-46fe-9f47-2099f07a3418","Type":"ContainerStarted","Data":"275062d2b99a8ebb8541ea4f63293aece8a7b01e006d8b2259a867e78c05f92b"} Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.030873 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.030854048 podStartE2EDuration="3.030854048s" podCreationTimestamp="2026-01-28 19:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:13.012974813 +0000 UTC m=+1541.024501608" watchObservedRunningTime="2026-01-28 19:01:13.030854048 +0000 UTC m=+1541.042380823" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.054410 
4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.054392023 podStartE2EDuration="4.054392023s" podCreationTimestamp="2026-01-28 19:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:13.037798751 +0000 UTC m=+1541.049325546" watchObservedRunningTime="2026-01-28 19:01:13.054392023 +0000 UTC m=+1541.065918798" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.072595 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.072533344 podStartE2EDuration="4.072533344s" podCreationTimestamp="2026-01-28 19:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:13.060524976 +0000 UTC m=+1541.072051741" watchObservedRunningTime="2026-01-28 19:01:13.072533344 +0000 UTC m=+1541.084060119" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.615517 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.621203 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.709054 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-config-data\") pod \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.709123 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6xc5\" (UniqueName: \"kubernetes.io/projected/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-kube-api-access-s6xc5\") pod \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.709245 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-config-data\") pod \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.709273 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h8cp\" (UniqueName: \"kubernetes.io/projected/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-kube-api-access-7h8cp\") pod \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.709444 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-combined-ca-bundle\") pod \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.709469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-combined-ca-bundle\") pod \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\" (UID: 
\"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.709577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-fernet-keys\") pod \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\" (UID: \"60b19a5a-17c1-4990-af7b-f3636f9e1cd2\") " Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.709592 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-scripts\") pod \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\" (UID: \"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe\") " Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.716601 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-scripts" (OuterVolumeSpecName: "scripts") pod "e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" (UID: "e75bbcdb-3953-4dbd-b80f-e3c487dc61fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.717836 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-kube-api-access-s6xc5" (OuterVolumeSpecName: "kube-api-access-s6xc5") pod "e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" (UID: "e75bbcdb-3953-4dbd-b80f-e3c487dc61fe"). InnerVolumeSpecName "kube-api-access-s6xc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.734466 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "60b19a5a-17c1-4990-af7b-f3636f9e1cd2" (UID: "60b19a5a-17c1-4990-af7b-f3636f9e1cd2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.735739 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-kube-api-access-7h8cp" (OuterVolumeSpecName: "kube-api-access-7h8cp") pod "60b19a5a-17c1-4990-af7b-f3636f9e1cd2" (UID: "60b19a5a-17c1-4990-af7b-f3636f9e1cd2"). InnerVolumeSpecName "kube-api-access-7h8cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.743589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" (UID: "e75bbcdb-3953-4dbd-b80f-e3c487dc61fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.755616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-config-data" (OuterVolumeSpecName: "config-data") pod "e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" (UID: "e75bbcdb-3953-4dbd-b80f-e3c487dc61fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.757952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60b19a5a-17c1-4990-af7b-f3636f9e1cd2" (UID: "60b19a5a-17c1-4990-af7b-f3636f9e1cd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.785557 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-config-data" (OuterVolumeSpecName: "config-data") pod "60b19a5a-17c1-4990-af7b-f3636f9e1cd2" (UID: "60b19a5a-17c1-4990-af7b-f3636f9e1cd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.812534 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.812565 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.812575 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.812584 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.812592 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.812602 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6xc5\" (UniqueName: \"kubernetes.io/projected/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-kube-api-access-s6xc5\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.812615 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:13 crc kubenswrapper[4749]: I0128 19:01:13.812625 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h8cp\" (UniqueName: \"kubernetes.io/projected/60b19a5a-17c1-4990-af7b-f3636f9e1cd2-kube-api-access-7h8cp\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.020937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dhv8b" event={"ID":"e75bbcdb-3953-4dbd-b80f-e3c487dc61fe","Type":"ContainerDied","Data":"44f89da9f63b272d40ba735624e055116d0b191b1ebec980d9a9b16a146bde95"} Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.020981 4749 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="44f89da9f63b272d40ba735624e055116d0b191b1ebec980d9a9b16a146bde95" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.020983 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dhv8b" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.024705 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29493781-5zxzd" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.031495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493781-5zxzd" event={"ID":"60b19a5a-17c1-4990-af7b-f3636f9e1cd2","Type":"ContainerDied","Data":"dbca978ae654cff75472b444ef112b4924c223401e797f9fa5ebb4e8ea58c2a2"} Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.031543 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbca978ae654cff75472b444ef112b4924c223401e797f9fa5ebb4e8ea58c2a2" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.434569 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-jsdx8"] Jan 28 19:01:14 crc kubenswrapper[4749]: E0128 19:01:14.435464 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" containerName="nova-manage" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.435484 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" containerName="nova-manage" Jan 28 19:01:14 crc kubenswrapper[4749]: E0128 19:01:14.435511 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b19a5a-17c1-4990-af7b-f3636f9e1cd2" containerName="keystone-cron" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.435519 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b19a5a-17c1-4990-af7b-f3636f9e1cd2" containerName="keystone-cron" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.435817 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b19a5a-17c1-4990-af7b-f3636f9e1cd2" containerName="keystone-cron" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.435861 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" containerName="nova-manage" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.436718 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.447485 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-4445-account-create-update-k492h"] Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.449029 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.456157 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.470736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jsdx8"] Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.508170 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-4445-account-create-update-k492h"] Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.530264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssxpv\" (UniqueName: \"kubernetes.io/projected/302b3dea-b921-4b69-9ef4-753a3aef5a2a-kube-api-access-ssxpv\") pod \"aodh-4445-account-create-update-k492h\" (UID: \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\") " pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.530308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tr66\" (UniqueName: \"kubernetes.io/projected/7ea37f1a-a23e-41cb-ae60-689f5aea8631-kube-api-access-8tr66\") pod \"aodh-db-create-jsdx8\" (UID: \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\") " pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.530417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/302b3dea-b921-4b69-9ef4-753a3aef5a2a-operator-scripts\") pod \"aodh-4445-account-create-update-k492h\" (UID: \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\") " pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.530518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea37f1a-a23e-41cb-ae60-689f5aea8631-operator-scripts\") pod \"aodh-db-create-jsdx8\" (UID: \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\") " pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.632815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/302b3dea-b921-4b69-9ef4-753a3aef5a2a-operator-scripts\") pod \"aodh-4445-account-create-update-k492h\" (UID: \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\") " pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.632944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea37f1a-a23e-41cb-ae60-689f5aea8631-operator-scripts\") pod \"aodh-db-create-jsdx8\" (UID: \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\") " pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.633130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssxpv\" (UniqueName: \"kubernetes.io/projected/302b3dea-b921-4b69-9ef4-753a3aef5a2a-kube-api-access-ssxpv\") pod \"aodh-4445-account-create-update-k492h\" (UID: \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\") " pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.633155 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8tr66\" (UniqueName: \"kubernetes.io/projected/7ea37f1a-a23e-41cb-ae60-689f5aea8631-kube-api-access-8tr66\") pod \"aodh-db-create-jsdx8\" (UID: \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\") " pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.634309 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/302b3dea-b921-4b69-9ef4-753a3aef5a2a-operator-scripts\") pod \"aodh-4445-account-create-update-k492h\" (UID: \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\") " pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.634310 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea37f1a-a23e-41cb-ae60-689f5aea8631-operator-scripts\") pod \"aodh-db-create-jsdx8\" (UID: \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\") " pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.656147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tr66\" (UniqueName: \"kubernetes.io/projected/7ea37f1a-a23e-41cb-ae60-689f5aea8631-kube-api-access-8tr66\") pod \"aodh-db-create-jsdx8\" (UID: \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\") " pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.657507 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssxpv\" (UniqueName: \"kubernetes.io/projected/302b3dea-b921-4b69-9ef4-753a3aef5a2a-kube-api-access-ssxpv\") pod \"aodh-4445-account-create-update-k492h\" (UID: \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\") " pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.771105 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:14 crc kubenswrapper[4749]: I0128 19:01:14.787881 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:15 crc kubenswrapper[4749]: I0128 19:01:15.043914 4749 generic.go:334] "Generic (PLEG): container finished" podID="acf9d598-9f13-42c7-9f82-9d6138a0d553" containerID="352f9833b2a26acbbdd6d3e25705e4404bdb55f7587ec441f1b36f07b86495fc" exitCode=0 Jan 28 19:01:15 crc kubenswrapper[4749]: I0128 19:01:15.044221 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" event={"ID":"acf9d598-9f13-42c7-9f82-9d6138a0d553","Type":"ContainerDied","Data":"352f9833b2a26acbbdd6d3e25705e4404bdb55f7587ec441f1b36f07b86495fc"} Jan 28 19:01:15 crc kubenswrapper[4749]: I0128 19:01:15.306289 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 19:01:15 crc kubenswrapper[4749]: I0128 19:01:15.307423 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 19:01:15 crc kubenswrapper[4749]: I0128 19:01:15.401008 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-4445-account-create-update-k492h"] Jan 28 19:01:15 crc kubenswrapper[4749]: I0128 19:01:15.438166 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jsdx8"] Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.055872 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ea37f1a-a23e-41cb-ae60-689f5aea8631" containerID="3bf32d6779d7074fd91e25898146450163b2578508ed8dd1a47a55bc584d0a51" exitCode=0 Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.056007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jsdx8" event={"ID":"7ea37f1a-a23e-41cb-ae60-689f5aea8631","Type":"ContainerDied","Data":"3bf32d6779d7074fd91e25898146450163b2578508ed8dd1a47a55bc584d0a51"} Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.056205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jsdx8" event={"ID":"7ea37f1a-a23e-41cb-ae60-689f5aea8631","Type":"ContainerStarted","Data":"01a550576895b361bbae3cb4678476dd815b3e2dc3f48aea1973b5b50d323ec2"} Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.058125 4749 generic.go:334] "Generic (PLEG): container finished" podID="302b3dea-b921-4b69-9ef4-753a3aef5a2a" containerID="f62c52e024b4d5f4042bbd1eb0be714728b16b3ffcd41161786dd875129eecb9" exitCode=0 Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.058339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4445-account-create-update-k492h" event={"ID":"302b3dea-b921-4b69-9ef4-753a3aef5a2a","Type":"ContainerDied","Data":"f62c52e024b4d5f4042bbd1eb0be714728b16b3ffcd41161786dd875129eecb9"} Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.058404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4445-account-create-update-k492h" event={"ID":"302b3dea-b921-4b69-9ef4-753a3aef5a2a","Type":"ContainerStarted","Data":"0498146aeeb9c695d0851e73e5f4098560b240551b0c0b3c652e509fdd2e7c38"} Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.535915 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.695271 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls75n\" (UniqueName: \"kubernetes.io/projected/acf9d598-9f13-42c7-9f82-9d6138a0d553-kube-api-access-ls75n\") pod \"acf9d598-9f13-42c7-9f82-9d6138a0d553\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.695343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-combined-ca-bundle\") pod \"acf9d598-9f13-42c7-9f82-9d6138a0d553\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.695406 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-config-data\") pod \"acf9d598-9f13-42c7-9f82-9d6138a0d553\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.695510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-scripts\") pod \"acf9d598-9f13-42c7-9f82-9d6138a0d553\" (UID: \"acf9d598-9f13-42c7-9f82-9d6138a0d553\") " Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.703269 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-scripts" (OuterVolumeSpecName: "scripts") pod "acf9d598-9f13-42c7-9f82-9d6138a0d553" (UID: "acf9d598-9f13-42c7-9f82-9d6138a0d553"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.704295 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf9d598-9f13-42c7-9f82-9d6138a0d553-kube-api-access-ls75n" (OuterVolumeSpecName: "kube-api-access-ls75n") pod "acf9d598-9f13-42c7-9f82-9d6138a0d553" (UID: "acf9d598-9f13-42c7-9f82-9d6138a0d553"). InnerVolumeSpecName "kube-api-access-ls75n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.733934 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acf9d598-9f13-42c7-9f82-9d6138a0d553" (UID: "acf9d598-9f13-42c7-9f82-9d6138a0d553"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.744220 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-config-data" (OuterVolumeSpecName: "config-data") pod "acf9d598-9f13-42c7-9f82-9d6138a0d553" (UID: "acf9d598-9f13-42c7-9f82-9d6138a0d553"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.798420 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls75n\" (UniqueName: \"kubernetes.io/projected/acf9d598-9f13-42c7-9f82-9d6138a0d553-kube-api-access-ls75n\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.798459 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.798473 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.798510 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acf9d598-9f13-42c7-9f82-9d6138a0d553-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:16 crc kubenswrapper[4749]: I0128 19:01:16.899924 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" probeResult="failure" output=< Jan 28 19:01:16 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:01:16 crc kubenswrapper[4749]: > Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.084162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" event={"ID":"acf9d598-9f13-42c7-9f82-9d6138a0d553","Type":"ContainerDied","Data":"5150d14ee08eedb8a32b732cc5fbae2ecc1205e156c3dcd0c2f4d6de3d5e7fb2"} Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.084480 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5150d14ee08eedb8a32b732cc5fbae2ecc1205e156c3dcd0c2f4d6de3d5e7fb2" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.084556 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bwj5s" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.102022 4749 generic.go:334] "Generic (PLEG): container finished" podID="451b80dc-7224-4bd0-a611-118ff9808edf" containerID="b3695c28d70a4698dc7cc2af278f555f840365b853c685e11dfd9672c7154747" exitCode=0 Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.102116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerDied","Data":"b3695c28d70a4698dc7cc2af278f555f840365b853c685e11dfd9672c7154747"} Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.166706 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 19:01:17 crc kubenswrapper[4749]: E0128 19:01:17.167348 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf9d598-9f13-42c7-9f82-9d6138a0d553" containerName="nova-cell1-conductor-db-sync" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.167363 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf9d598-9f13-42c7-9f82-9d6138a0d553" containerName="nova-cell1-conductor-db-sync" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.167586 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf9d598-9f13-42c7-9f82-9d6138a0d553" containerName="nova-cell1-conductor-db-sync" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.168467 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.174035 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.181200 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.314031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81127c65-0c78-442f-83b3-1a060a1a0452-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.314145 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81127c65-0c78-442f-83b3-1a060a1a0452-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.314423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk294\" (UniqueName: \"kubernetes.io/projected/81127c65-0c78-442f-83b3-1a060a1a0452-kube-api-access-zk294\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.416729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk294\" (UniqueName: \"kubernetes.io/projected/81127c65-0c78-442f-83b3-1a060a1a0452-kube-api-access-zk294\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc 
kubenswrapper[4749]: I0128 19:01:17.416847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81127c65-0c78-442f-83b3-1a060a1a0452-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.416928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81127c65-0c78-442f-83b3-1a060a1a0452-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.421528 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.422663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81127c65-0c78-442f-83b3-1a060a1a0452-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.422852 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81127c65-0c78-442f-83b3-1a060a1a0452-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.435006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk294\" (UniqueName: \"kubernetes.io/projected/81127c65-0c78-442f-83b3-1a060a1a0452-kube-api-access-zk294\") pod \"nova-cell1-conductor-0\" (UID: \"81127c65-0c78-442f-83b3-1a060a1a0452\") " pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.511389 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.518608 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-run-httpd\") pod \"451b80dc-7224-4bd0-a611-118ff9808edf\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.518727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-sg-core-conf-yaml\") pod \"451b80dc-7224-4bd0-a611-118ff9808edf\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.518806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-config-data\") pod \"451b80dc-7224-4bd0-a611-118ff9808edf\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.518862 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd8qk\" (UniqueName: \"kubernetes.io/projected/451b80dc-7224-4bd0-a611-118ff9808edf-kube-api-access-zd8qk\") pod \"451b80dc-7224-4bd0-a611-118ff9808edf\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.518919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-scripts\") pod \"451b80dc-7224-4bd0-a611-118ff9808edf\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.518978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-log-httpd\") pod \"451b80dc-7224-4bd0-a611-118ff9808edf\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.519175 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-combined-ca-bundle\") pod \"451b80dc-7224-4bd0-a611-118ff9808edf\" (UID: \"451b80dc-7224-4bd0-a611-118ff9808edf\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.520302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "451b80dc-7224-4bd0-a611-118ff9808edf" (UID: "451b80dc-7224-4bd0-a611-118ff9808edf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.520999 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "451b80dc-7224-4bd0-a611-118ff9808edf" (UID: "451b80dc-7224-4bd0-a611-118ff9808edf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.525375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451b80dc-7224-4bd0-a611-118ff9808edf-kube-api-access-zd8qk" (OuterVolumeSpecName: "kube-api-access-zd8qk") pod "451b80dc-7224-4bd0-a611-118ff9808edf" (UID: "451b80dc-7224-4bd0-a611-118ff9808edf"). InnerVolumeSpecName "kube-api-access-zd8qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.531768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-scripts" (OuterVolumeSpecName: "scripts") pod "451b80dc-7224-4bd0-a611-118ff9808edf" (UID: "451b80dc-7224-4bd0-a611-118ff9808edf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.563144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "451b80dc-7224-4bd0-a611-118ff9808edf" (UID: "451b80dc-7224-4bd0-a611-118ff9808edf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.622297 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.622356 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.622373 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd8qk\" (UniqueName: \"kubernetes.io/projected/451b80dc-7224-4bd0-a611-118ff9808edf-kube-api-access-zd8qk\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.622385 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.622395 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451b80dc-7224-4bd0-a611-118ff9808edf-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.626590 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "451b80dc-7224-4bd0-a611-118ff9808edf" (UID: "451b80dc-7224-4bd0-a611-118ff9808edf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.662467 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.700200 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.723176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea37f1a-a23e-41cb-ae60-689f5aea8631-operator-scripts\") pod \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\" (UID: \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.723660 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tr66\" (UniqueName: \"kubernetes.io/projected/7ea37f1a-a23e-41cb-ae60-689f5aea8631-kube-api-access-8tr66\") pod \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\" (UID: \"7ea37f1a-a23e-41cb-ae60-689f5aea8631\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.724607 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.725748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ea37f1a-a23e-41cb-ae60-689f5aea8631-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ea37f1a-a23e-41cb-ae60-689f5aea8631" (UID: "7ea37f1a-a23e-41cb-ae60-689f5aea8631"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.728056 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea37f1a-a23e-41cb-ae60-689f5aea8631-kube-api-access-8tr66" (OuterVolumeSpecName: "kube-api-access-8tr66") pod "7ea37f1a-a23e-41cb-ae60-689f5aea8631" (UID: "7ea37f1a-a23e-41cb-ae60-689f5aea8631"). InnerVolumeSpecName "kube-api-access-8tr66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.736080 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-config-data" (OuterVolumeSpecName: "config-data") pod "451b80dc-7224-4bd0-a611-118ff9808edf" (UID: "451b80dc-7224-4bd0-a611-118ff9808edf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.826587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssxpv\" (UniqueName: \"kubernetes.io/projected/302b3dea-b921-4b69-9ef4-753a3aef5a2a-kube-api-access-ssxpv\") pod \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\" (UID: \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.826856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/302b3dea-b921-4b69-9ef4-753a3aef5a2a-operator-scripts\") pod \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\" (UID: \"302b3dea-b921-4b69-9ef4-753a3aef5a2a\") " Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.827414 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451b80dc-7224-4bd0-a611-118ff9808edf-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.827430 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tr66\" (UniqueName: \"kubernetes.io/projected/7ea37f1a-a23e-41cb-ae60-689f5aea8631-kube-api-access-8tr66\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.827440 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea37f1a-a23e-41cb-ae60-689f5aea8631-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.827685 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302b3dea-b921-4b69-9ef4-753a3aef5a2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "302b3dea-b921-4b69-9ef4-753a3aef5a2a" (UID: "302b3dea-b921-4b69-9ef4-753a3aef5a2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.829612 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/302b3dea-b921-4b69-9ef4-753a3aef5a2a-kube-api-access-ssxpv" (OuterVolumeSpecName: "kube-api-access-ssxpv") pod "302b3dea-b921-4b69-9ef4-753a3aef5a2a" (UID: "302b3dea-b921-4b69-9ef4-753a3aef5a2a"). InnerVolumeSpecName "kube-api-access-ssxpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.930010 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssxpv\" (UniqueName: \"kubernetes.io/projected/302b3dea-b921-4b69-9ef4-753a3aef5a2a-kube-api-access-ssxpv\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:17 crc kubenswrapper[4749]: I0128 19:01:17.930044 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/302b3dea-b921-4b69-9ef4-753a3aef5a2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.082716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.114380 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451b80dc-7224-4bd0-a611-118ff9808edf","Type":"ContainerDied","Data":"a8f5f1e947619978bbf4cf76d7053602290717200b3f63f969ec67af157ab908"} Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.114441 4749 scope.go:117] "RemoveContainer" containerID="f3bfa85f0aa5c7e97173869e6b40a4b14a92831e4e164d3aeff4f744968a4006" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.114508 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.121184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jsdx8" event={"ID":"7ea37f1a-a23e-41cb-ae60-689f5aea8631","Type":"ContainerDied","Data":"01a550576895b361bbae3cb4678476dd815b3e2dc3f48aea1973b5b50d323ec2"} Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.121307 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a550576895b361bbae3cb4678476dd815b3e2dc3f48aea1973b5b50d323ec2" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.121261 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jsdx8" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.123828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"81127c65-0c78-442f-83b3-1a060a1a0452","Type":"ContainerStarted","Data":"defd7d19e114bb53298b6dc646c3743d12fec468b07708d53ee10fcdd0915629"} Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.127697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4445-account-create-update-k492h" event={"ID":"302b3dea-b921-4b69-9ef4-753a3aef5a2a","Type":"ContainerDied","Data":"0498146aeeb9c695d0851e73e5f4098560b240551b0c0b3c652e509fdd2e7c38"} Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.127728 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0498146aeeb9c695d0851e73e5f4098560b240551b0c0b3c652e509fdd2e7c38" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.127786 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-4445-account-create-update-k492h" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.206421 4749 scope.go:117] "RemoveContainer" containerID="ad2c5fd7a27007c73dbaf1cbff333a1334947b6ae4e234ae094600ecb3f759b7" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.243624 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.250556 4749 scope.go:117] "RemoveContainer" containerID="4f17aa195430061da8b0d8b496f51e05ef3911aea5a8bb99bf4399e8da6048ec" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.264645 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.279592 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:18 crc kubenswrapper[4749]: E0128 19:01:18.280249 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="ceilometer-central-agent" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280270 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="ceilometer-central-agent" Jan 28 19:01:18 crc kubenswrapper[4749]: E0128 19:01:18.280286 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302b3dea-b921-4b69-9ef4-753a3aef5a2a" containerName="mariadb-account-create-update" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280293 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="302b3dea-b921-4b69-9ef4-753a3aef5a2a" containerName="mariadb-account-create-update" Jan 28 19:01:18 crc kubenswrapper[4749]: E0128 19:01:18.280309 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="sg-core" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280316 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="sg-core" Jan 28 19:01:18 crc kubenswrapper[4749]: E0128 19:01:18.280434 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea37f1a-a23e-41cb-ae60-689f5aea8631" containerName="mariadb-database-create" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280443 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea37f1a-a23e-41cb-ae60-689f5aea8631" containerName="mariadb-database-create" Jan 28 19:01:18 crc kubenswrapper[4749]: E0128 19:01:18.280457 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="ceilometer-notification-agent" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280465 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="ceilometer-notification-agent" Jan 28 19:01:18 crc kubenswrapper[4749]: E0128 19:01:18.280482 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="proxy-httpd" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280489 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="proxy-httpd" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280699 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="302b3dea-b921-4b69-9ef4-753a3aef5a2a" containerName="mariadb-account-create-update" Jan 28 
19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280719 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="sg-core" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280729 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="ceilometer-notification-agent" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280742 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea37f1a-a23e-41cb-ae60-689f5aea8631" containerName="mariadb-database-create" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280752 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="ceilometer-central-agent" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.280763 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" containerName="proxy-httpd" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.282981 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.285881 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.286003 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.290001 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.290104 4749 scope.go:117] "RemoveContainer" containerID="b3695c28d70a4698dc7cc2af278f555f840365b853c685e11dfd9672c7154747" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.345248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.345405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-config-data\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.345530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-scripts\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.345687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-run-httpd\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.345771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.345866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r56l\" (UniqueName: \"kubernetes.io/projected/42c65a95-0176-43fc-a216-0bd92417240d-kube-api-access-9r56l\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.346044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-log-httpd\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.448870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-run-httpd\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.448928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.448979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r56l\" (UniqueName: \"kubernetes.io/projected/42c65a95-0176-43fc-a216-0bd92417240d-kube-api-access-9r56l\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.449034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-log-httpd\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.449148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.449189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-config-data\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.449265 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-scripts\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.449499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-run-httpd\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.450573 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-log-httpd\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.454124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.454210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-scripts\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.454875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-config-data\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.455012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.467654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r56l\" (UniqueName: \"kubernetes.io/projected/42c65a95-0176-43fc-a216-0bd92417240d-kube-api-access-9r56l\") pod \"ceilometer-0\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.611725 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:18 crc kubenswrapper[4749]: I0128 19:01:18.912995 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451b80dc-7224-4bd0-a611-118ff9808edf" path="/var/lib/kubelet/pods/451b80dc-7224-4bd0-a611-118ff9808edf/volumes" Jan 28 19:01:19 crc kubenswrapper[4749]: W0128 19:01:19.093304 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c65a95_0176_43fc_a216_0bd92417240d.slice/crio-1b4fe75607cf510d0001bac3ded684593e3beb30cb0a343e58f7a6064e3ba369 WatchSource:0}: Error finding container 1b4fe75607cf510d0001bac3ded684593e3beb30cb0a343e58f7a6064e3ba369: Status 404 returned error can't find the container with id 1b4fe75607cf510d0001bac3ded684593e3beb30cb0a343e58f7a6064e3ba369 Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.093347 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.140459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerStarted","Data":"1b4fe75607cf510d0001bac3ded684593e3beb30cb0a343e58f7a6064e3ba369"} Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.143598 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"81127c65-0c78-442f-83b3-1a060a1a0452","Type":"ContainerStarted","Data":"b61b3af67c7f10a25f15505b8cf9ad4853aa90961146bf476da4c2e984eaa54f"} Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.145122 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.166762 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.166737901 podStartE2EDuration="2.166737901s" podCreationTimestamp="2026-01-28 19:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:19.157197273 +0000 UTC m=+1547.168724068" watchObservedRunningTime="2026-01-28 19:01:19.166737901 +0000 UTC m=+1547.178264676" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.779882 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-jvjll"] Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.783454 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.787703 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.787912 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.788022 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.788048 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-qbtm4" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.798994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-jvjll"] Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.886901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-scripts\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.887168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-config-data\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.887523 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-combined-ca-bundle\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.887569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phdjg\" (UniqueName: \"kubernetes.io/projected/450c0e30-35b1-49d3-b11d-996b73fb11e8-kube-api-access-phdjg\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.989889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-config-data\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.990100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-combined-ca-bundle\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.990140 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phdjg\" (UniqueName: \"kubernetes.io/projected/450c0e30-35b1-49d3-b11d-996b73fb11e8-kube-api-access-phdjg\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc 
kubenswrapper[4749]: I0128 19:01:19.990215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-scripts\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.995856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-scripts\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:19 crc kubenswrapper[4749]: I0128 19:01:19.996303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-config-data\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.001029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-combined-ca-bundle\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.008506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phdjg\" (UniqueName: \"kubernetes.io/projected/450c0e30-35b1-49d3-b11d-996b73fb11e8-kube-api-access-phdjg\") pod \"aodh-db-sync-jvjll\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.109071 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.176128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerStarted","Data":"ffacc808180808e02ca1d42c4c565d65b86c70326a0b4615289f58b48af2341a"} Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.307063 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.307454 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.427891 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.427936 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 19:01:20 crc kubenswrapper[4749]: I0128 19:01:20.699954 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-jvjll"] Jan 28 19:01:20 crc kubenswrapper[4749]: W0128 19:01:20.736667 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod450c0e30_35b1_49d3_b11d_996b73fb11e8.slice/crio-d6650a2cb686bf57ce05b9a537115095e493a40502218b7a663478ca393560da WatchSource:0}: Error finding container d6650a2cb686bf57ce05b9a537115095e493a40502218b7a663478ca393560da: Status 404 returned error can't find the container with id d6650a2cb686bf57ce05b9a537115095e493a40502218b7a663478ca393560da Jan 28 19:01:21 crc kubenswrapper[4749]: I0128 19:01:21.193294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jvjll" event={"ID":"450c0e30-35b1-49d3-b11d-996b73fb11e8","Type":"ContainerStarted","Data":"d6650a2cb686bf57ce05b9a537115095e493a40502218b7a663478ca393560da"} Jan 28 19:01:21 crc kubenswrapper[4749]: I0128 19:01:21.357061 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 28 19:01:21 crc kubenswrapper[4749]: I0128 19:01:21.361513 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:21 crc kubenswrapper[4749]: I0128 19:01:21.361533 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:21 crc kubenswrapper[4749]: I0128 19:01:21.511533 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.246:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:21 crc kubenswrapper[4749]: I0128 19:01:21.511530 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-log" 
probeResult="failure" output="Get \"http://10.217.0.246:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:22 crc kubenswrapper[4749]: I0128 19:01:22.233527 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerStarted","Data":"0d83e4b723af93b84e5f442fdfc2c3dd4a046c0e4a2de26f3fbd019d87d92431"} Jan 28 19:01:22 crc kubenswrapper[4749]: I0128 19:01:22.319878 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:22 crc kubenswrapper[4749]: I0128 19:01:22.320133 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-log" containerID="cri-o://c9e30660453dfce533b916786481a5960055edee0c9971434aa5ccd60d84c0d9" gracePeriod=30 Jan 28 19:01:22 crc kubenswrapper[4749]: I0128 19:01:22.320295 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-api" containerID="cri-o://de308d673e6252583bd077d5d179053b8ec39baa0df88576dd2454a52bebc395" gracePeriod=30 Jan 28 19:01:22 crc kubenswrapper[4749]: I0128 19:01:22.381054 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:22 crc kubenswrapper[4749]: I0128 19:01:22.381421 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-log" containerID="cri-o://d7e7a8c5ca28a2eb850d3ff75ac33caee033f618897a27307d2c224fbb9011db" gracePeriod=30 Jan 28 19:01:22 crc kubenswrapper[4749]: I0128 19:01:22.381697 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-metadata" containerID="cri-o://275062d2b99a8ebb8541ea4f63293aece8a7b01e006d8b2259a867e78c05f92b" gracePeriod=30 Jan 28 19:01:22 crc kubenswrapper[4749]: E0128 19:01:22.915122 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c60b12f_ab80_4d1e_957f_15d57d5cd36c.slice/crio-c9e30660453dfce533b916786481a5960055edee0c9971434aa5ccd60d84c0d9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7abe3a3_c49a_46fe_9f47_2099f07a3418.slice/crio-conmon-d7e7a8c5ca28a2eb850d3ff75ac33caee033f618897a27307d2c224fbb9011db.scope\": RecentStats: unable to find data in memory cache]" Jan 28 19:01:23 crc kubenswrapper[4749]: I0128 19:01:23.259212 4749 generic.go:334] "Generic (PLEG): container finished" podID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerID="d7e7a8c5ca28a2eb850d3ff75ac33caee033f618897a27307d2c224fbb9011db" exitCode=143 Jan 28 19:01:23 crc kubenswrapper[4749]: I0128 19:01:23.259670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7abe3a3-c49a-46fe-9f47-2099f07a3418","Type":"ContainerDied","Data":"d7e7a8c5ca28a2eb850d3ff75ac33caee033f618897a27307d2c224fbb9011db"} Jan 28 19:01:23 crc kubenswrapper[4749]: I0128 19:01:23.266518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerStarted","Data":"68549bc3157660e557505e860c2c608e3ec7a934932f49f2696ed18bee1c69e0"} Jan 28 19:01:23 crc kubenswrapper[4749]: I0128 19:01:23.278896 4749 generic.go:334] "Generic (PLEG): container finished" podID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerID="c9e30660453dfce533b916786481a5960055edee0c9971434aa5ccd60d84c0d9" exitCode=143 Jan 28 19:01:23 crc kubenswrapper[4749]: I0128 19:01:23.278964 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c60b12f-ab80-4d1e-957f-15d57d5cd36c","Type":"ContainerDied","Data":"c9e30660453dfce533b916786481a5960055edee0c9971434aa5ccd60d84c0d9"} Jan 28 19:01:25 crc kubenswrapper[4749]: I0128 19:01:25.303148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerStarted","Data":"5c10778771e3a97f700d66e63d6ae614d1d9bd6e7bb03877e90c4a8c81d9daea"} Jan 28 19:01:25 crc kubenswrapper[4749]: I0128 19:01:25.304008 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 19:01:25 crc kubenswrapper[4749]: I0128 19:01:25.336291 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.08701961 podStartE2EDuration="7.336272601s" podCreationTimestamp="2026-01-28 19:01:18 +0000 UTC" firstStartedPulling="2026-01-28 19:01:19.095847076 +0000 UTC m=+1547.107373851" lastFinishedPulling="2026-01-28 19:01:24.345100067 +0000 UTC m=+1552.356626842" observedRunningTime="2026-01-28 19:01:25.326072487 +0000 UTC m=+1553.337599302" watchObservedRunningTime="2026-01-28 19:01:25.336272601 +0000 UTC m=+1553.347799376" Jan 28 19:01:25 crc kubenswrapper[4749]: I0128 19:01:25.908944 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 19:01:25 crc kubenswrapper[4749]: I0128 19:01:25.969590 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 19:01:26 crc kubenswrapper[4749]: I0128 19:01:26.155746 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5mr8"] Jan 28 19:01:27 crc kubenswrapper[4749]: I0128 19:01:27.325292 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n5mr8" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" containerID="cri-o://33ca4c7b511939cde9c2d3023340f2eb8af79c630f3f9f74f5285f25d1b5d214" gracePeriod=2 Jan 28 19:01:27 crc kubenswrapper[4749]: I0128 19:01:27.467981 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:01:27 crc kubenswrapper[4749]: I0128 19:01:27.468048 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:01:27 crc kubenswrapper[4749]: I0128 19:01:27.468113 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:01:27 crc kubenswrapper[4749]: I0128 19:01:27.469059 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:01:27 crc kubenswrapper[4749]: I0128 19:01:27.469116 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" gracePeriod=600 Jan 28 19:01:27 crc kubenswrapper[4749]: I0128 19:01:27.541984 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 28 19:01:27 crc kubenswrapper[4749]: I0128 19:01:27.620588 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-5xs5q" podUID="d597ae59-efbc-48b1-9f21-51ead30e9812" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.524142 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" exitCode=0 Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.524217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e"} Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.524822 4749 scope.go:117] "RemoveContainer" containerID="6f6753aa33414e3e6ec1468bb7657379fc59db0884f9f3f2e5c921382fe9d6fb" Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.530070 4749 generic.go:334] "Generic (PLEG): container finished" podID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerID="de308d673e6252583bd077d5d179053b8ec39baa0df88576dd2454a52bebc395" exitCode=0 Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.530143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c60b12f-ab80-4d1e-957f-15d57d5cd36c","Type":"ContainerDied","Data":"de308d673e6252583bd077d5d179053b8ec39baa0df88576dd2454a52bebc395"} Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.536651 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerID="33ca4c7b511939cde9c2d3023340f2eb8af79c630f3f9f74f5285f25d1b5d214" exitCode=0 Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.536711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerDied","Data":"33ca4c7b511939cde9c2d3023340f2eb8af79c630f3f9f74f5285f25d1b5d214"} Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.538940 4749 generic.go:334] "Generic (PLEG): container finished" podID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" 
containerID="275062d2b99a8ebb8541ea4f63293aece8a7b01e006d8b2259a867e78c05f92b" exitCode=0 Jan 28 19:01:29 crc kubenswrapper[4749]: I0128 19:01:29.538965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7abe3a3-c49a-46fe-9f47-2099f07a3418","Type":"ContainerDied","Data":"275062d2b99a8ebb8541ea4f63293aece8a7b01e006d8b2259a867e78c05f92b"} Jan 28 19:01:32 crc kubenswrapper[4749]: E0128 19:01:32.963509 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.074254 4749 scope.go:117] "RemoveContainer" containerID="4c791ddf7b0951e536637ecce5ac329ade0090d4a43dab367ede7e34c5c4e425" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.292269 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.532286 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.566751 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.578777 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.593872 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.594311 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.601453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c60b12f-ab80-4d1e-957f-15d57d5cd36c","Type":"ContainerDied","Data":"2f13536a529566499b821275d2c40c6ed50592debe426cf6687576a90fa82bd3"} Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.601473 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.602038 4749 scope.go:117] "RemoveContainer" containerID="de308d673e6252583bd077d5d179053b8ec39baa0df88576dd2454a52bebc395" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-combined-ca-bundle\") pod \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnz6j\" (UniqueName: \"kubernetes.io/projected/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-kube-api-access-mnz6j\") pod \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606314 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-nova-metadata-tls-certs\") pod \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606363 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkdtt\" (UniqueName: \"kubernetes.io/projected/f7abe3a3-c49a-46fe-9f47-2099f07a3418-kube-api-access-lkdtt\") pod \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606517 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m2zt\" (UniqueName: \"kubernetes.io/projected/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-kube-api-access-8m2zt\") pod \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606595 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-utilities\") pod \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-config-data\") pod \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606750 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-logs\") pod \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-catalog-content\") pod \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\" (UID: \"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606896 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7abe3a3-c49a-46fe-9f47-2099f07a3418-logs\") pod \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606927 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-combined-ca-bundle\") pod \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\" (UID: \"4c60b12f-ab80-4d1e-957f-15d57d5cd36c\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.606952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-config-data\") pod \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\" (UID: \"f7abe3a3-c49a-46fe-9f47-2099f07a3418\") " Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.607345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5mr8" event={"ID":"a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d","Type":"ContainerDied","Data":"e03aec62389f583782afba9ed42dbfa19e024caacab4d34fd363d7b050c01eef"} Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.607475 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5mr8" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.608544 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-logs" (OuterVolumeSpecName: "logs") pod "4c60b12f-ab80-4d1e-957f-15d57d5cd36c" (UID: "4c60b12f-ab80-4d1e-957f-15d57d5cd36c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.611034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7abe3a3-c49a-46fe-9f47-2099f07a3418-logs" (OuterVolumeSpecName: "logs") pod "f7abe3a3-c49a-46fe-9f47-2099f07a3418" (UID: "f7abe3a3-c49a-46fe-9f47-2099f07a3418"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.611124 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-utilities" (OuterVolumeSpecName: "utilities") pod "a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" (UID: "a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.613418 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.613470 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-logs\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.613479 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7abe3a3-c49a-46fe-9f47-2099f07a3418-logs\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.614837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f7abe3a3-c49a-46fe-9f47-2099f07a3418","Type":"ContainerDied","Data":"515640c2b0f3d7f6e97e1a924d99d8a9b139e44a8feff59afc319252555c7277"} Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.614937 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.631989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7abe3a3-c49a-46fe-9f47-2099f07a3418-kube-api-access-lkdtt" (OuterVolumeSpecName: "kube-api-access-lkdtt") pod "f7abe3a3-c49a-46fe-9f47-2099f07a3418" (UID: "f7abe3a3-c49a-46fe-9f47-2099f07a3418"). InnerVolumeSpecName "kube-api-access-lkdtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.635701 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-kube-api-access-mnz6j" (OuterVolumeSpecName: "kube-api-access-mnz6j") pod "a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" (UID: "a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d"). InnerVolumeSpecName "kube-api-access-mnz6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.642021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-kube-api-access-8m2zt" (OuterVolumeSpecName: "kube-api-access-8m2zt") pod "4c60b12f-ab80-4d1e-957f-15d57d5cd36c" (UID: "4c60b12f-ab80-4d1e-957f-15d57d5cd36c"). InnerVolumeSpecName "kube-api-access-8m2zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.652592 4749 scope.go:117] "RemoveContainer" containerID="c9e30660453dfce533b916786481a5960055edee0c9971434aa5ccd60d84c0d9" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.653727 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c60b12f-ab80-4d1e-957f-15d57d5cd36c" (UID: "4c60b12f-ab80-4d1e-957f-15d57d5cd36c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.663999 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7abe3a3-c49a-46fe-9f47-2099f07a3418" (UID: "f7abe3a3-c49a-46fe-9f47-2099f07a3418"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.691654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-config-data" (OuterVolumeSpecName: "config-data") pod "4c60b12f-ab80-4d1e-957f-15d57d5cd36c" (UID: "4c60b12f-ab80-4d1e-957f-15d57d5cd36c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.694936 4749 scope.go:117] "RemoveContainer" containerID="33ca4c7b511939cde9c2d3023340f2eb8af79c630f3f9f74f5285f25d1b5d214" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.717495 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-config-data" (OuterVolumeSpecName: "config-data") pod "f7abe3a3-c49a-46fe-9f47-2099f07a3418" (UID: "f7abe3a3-c49a-46fe-9f47-2099f07a3418"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.718902 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m2zt\" (UniqueName: \"kubernetes.io/projected/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-kube-api-access-8m2zt\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.718960 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.718976 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c60b12f-ab80-4d1e-957f-15d57d5cd36c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.718989 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.719002 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.719020 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnz6j\" (UniqueName: \"kubernetes.io/projected/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-kube-api-access-mnz6j\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.719032 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkdtt\" (UniqueName: \"kubernetes.io/projected/f7abe3a3-c49a-46fe-9f47-2099f07a3418-kube-api-access-lkdtt\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.727938 4749 scope.go:117] "RemoveContainer" 
containerID="0bbefaa424dab42ca864a3aab11db13f65a5aaf094cc18d7f00ed0441f4def8c" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.753588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f7abe3a3-c49a-46fe-9f47-2099f07a3418" (UID: "f7abe3a3-c49a-46fe-9f47-2099f07a3418"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.770709 4749 scope.go:117] "RemoveContainer" containerID="5f245ece56dd3fbf29d85f17cb9590383b21f6462ae3111510e207040edb4e9c" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.795802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" (UID: "a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.796084 4749 scope.go:117] "RemoveContainer" containerID="275062d2b99a8ebb8541ea4f63293aece8a7b01e006d8b2259a867e78c05f92b" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.821588 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.821644 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7abe3a3-c49a-46fe-9f47-2099f07a3418-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.836647 4749 scope.go:117] "RemoveContainer" containerID="d7e7a8c5ca28a2eb850d3ff75ac33caee033f618897a27307d2c224fbb9011db" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.956965 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.971407 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.982489 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.983007 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983036 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.983045 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="extract-content" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983058 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="extract-content" Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.983075 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="extract-utilities" Jan 28 19:01:33 crc kubenswrapper[4749]: 
I0128 19:01:33.983082 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="extract-utilities" Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.983095 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-log" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983101 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-log" Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.983114 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-metadata" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983120 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-metadata" Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.983131 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983136 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.983156 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-log" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983162 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-log" Jan 28 19:01:33 crc kubenswrapper[4749]: E0128 19:01:33.983180 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-api" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983186 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-api" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983407 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-log" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983430 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983440 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" containerName="nova-metadata-metadata" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983451 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-log" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983462 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" containerName="nova-api-api" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.983860 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" containerName="registry-server" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.984695 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.995274 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 19:01:33 crc kubenswrapper[4749]: I0128 19:01:33.995732 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.025786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/418dea88-d8b7-43db-8760-6b055d9f6b04-logs\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.025832 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-config-data\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.025859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txbmk\" (UniqueName: \"kubernetes.io/projected/418dea88-d8b7-43db-8760-6b055d9f6b04-kube-api-access-txbmk\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.026248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.057004 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5mr8"] Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.073064 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n5mr8"] Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.088229 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.099527 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.131272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txbmk\" (UniqueName: \"kubernetes.io/projected/418dea88-d8b7-43db-8760-6b055d9f6b04-kube-api-access-txbmk\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.131464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.131551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/418dea88-d8b7-43db-8760-6b055d9f6b04-logs\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 
19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.131581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-config-data\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.137387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/418dea88-d8b7-43db-8760-6b055d9f6b04-logs\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.138109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.140462 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.156512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-config-data\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.161263 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.164854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txbmk\" (UniqueName: \"kubernetes.io/projected/418dea88-d8b7-43db-8760-6b055d9f6b04-kube-api-access-txbmk\") pod \"nova-api-0\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.165383 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.165658 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.182482 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.233367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.233582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d411e3b-b35c-4f9b-8463-4b18218c6693-logs\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.233661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg794\" (UniqueName: 
\"kubernetes.io/projected/9d411e3b-b35c-4f9b-8463-4b18218c6693-kube-api-access-sg794\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.233678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-config-data\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.233702 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.335692 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d411e3b-b35c-4f9b-8463-4b18218c6693-logs\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.336076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg794\" (UniqueName: \"kubernetes.io/projected/9d411e3b-b35c-4f9b-8463-4b18218c6693-kube-api-access-sg794\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.336097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-config-data\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.336119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.336128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d411e3b-b35c-4f9b-8463-4b18218c6693-logs\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.336208 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.339835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-config-data\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.340171 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.340707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.354967 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.355897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg794\" (UniqueName: \"kubernetes.io/projected/9d411e3b-b35c-4f9b-8463-4b18218c6693-kube-api-access-sg794\") pod \"nova-metadata-0\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.548892 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.630769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jvjll" event={"ID":"450c0e30-35b1-49d3-b11d-996b73fb11e8","Type":"ContainerStarted","Data":"532837a3a9279aba0aaf8f5c1ca2cfeb44671c0c41344be9f94f326ca727713b"} Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.662122 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-jvjll" podStartSLOduration=3.114224975 podStartE2EDuration="15.662103032s" podCreationTimestamp="2026-01-28 19:01:19 +0000 UTC" firstStartedPulling="2026-01-28 19:01:20.739066486 +0000 UTC m=+1548.750593261" lastFinishedPulling="2026-01-28 19:01:33.286944543 +0000 UTC m=+1561.298471318" observedRunningTime="2026-01-28 19:01:34.651660782 +0000 UTC m=+1562.663187567" watchObservedRunningTime="2026-01-28 19:01:34.662103032 +0000 UTC m=+1562.673629817" Jan 28 19:01:34 crc kubenswrapper[4749]: W0128 19:01:34.801018 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod418dea88_d8b7_43db_8760_6b055d9f6b04.slice/crio-30735ad05c23c0e5a5c8e5663f37b8f977fd678d81c619bf6173c65bdde2d411 WatchSource:0}: Error finding container 30735ad05c23c0e5a5c8e5663f37b8f977fd678d81c619bf6173c65bdde2d411: Status 404 returned error can't find the container with id 30735ad05c23c0e5a5c8e5663f37b8f977fd678d81c619bf6173c65bdde2d411 Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.804282 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.890845 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c60b12f-ab80-4d1e-957f-15d57d5cd36c" path="/var/lib/kubelet/pods/4c60b12f-ab80-4d1e-957f-15d57d5cd36c/volumes" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.891772 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d" path="/var/lib/kubelet/pods/a7ac6957-d7ec-4aee-9a3e-3b0b3ebbbb0d/volumes" Jan 28 19:01:34 crc kubenswrapper[4749]: I0128 19:01:34.892981 4749 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f7abe3a3-c49a-46fe-9f47-2099f07a3418" path="/var/lib/kubelet/pods/f7abe3a3-c49a-46fe-9f47-2099f07a3418/volumes" Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.019668 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.657137 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"418dea88-d8b7-43db-8760-6b055d9f6b04","Type":"ContainerStarted","Data":"6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970"} Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.658109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"418dea88-d8b7-43db-8760-6b055d9f6b04","Type":"ContainerStarted","Data":"1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1"} Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.658256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"418dea88-d8b7-43db-8760-6b055d9f6b04","Type":"ContainerStarted","Data":"30735ad05c23c0e5a5c8e5663f37b8f977fd678d81c619bf6173c65bdde2d411"} Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.662838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d411e3b-b35c-4f9b-8463-4b18218c6693","Type":"ContainerStarted","Data":"f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34"} Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.662903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d411e3b-b35c-4f9b-8463-4b18218c6693","Type":"ContainerStarted","Data":"639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07"} Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.662918 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d411e3b-b35c-4f9b-8463-4b18218c6693","Type":"ContainerStarted","Data":"bd55eced5d3325a9e6bf16e3aa31de225a8fbeeba783594651957556ab7a9747"} Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.692560 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.692538392 podStartE2EDuration="2.692538392s" podCreationTimestamp="2026-01-28 19:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:35.684088152 +0000 UTC m=+1563.695614937" watchObservedRunningTime="2026-01-28 19:01:35.692538392 +0000 UTC m=+1563.704065167" Jan 28 19:01:35 crc kubenswrapper[4749]: I0128 19:01:35.726635 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.72661517 podStartE2EDuration="1.72661517s" podCreationTimestamp="2026-01-28 19:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:35.70607324 +0000 UTC m=+1563.717600025" watchObservedRunningTime="2026-01-28 19:01:35.72661517 +0000 UTC m=+1563.738141935" Jan 28 19:01:36 crc kubenswrapper[4749]: I0128 19:01:36.674039 4749 generic.go:334] "Generic (PLEG): container finished" podID="db2b8c43-d40b-4cc6-9b84-d9d43660af11" containerID="ef7cf3b768ab072ae74ed4e2a7429c5d8bb818e1c5fc9d696b07e3b417eb0226" exitCode=137 Jan 28 19:01:36 crc kubenswrapper[4749]: I0128 19:01:36.674117 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db2b8c43-d40b-4cc6-9b84-d9d43660af11","Type":"ContainerDied","Data":"ef7cf3b768ab072ae74ed4e2a7429c5d8bb818e1c5fc9d696b07e3b417eb0226"} Jan 28 19:01:36 crc kubenswrapper[4749]: I0128 19:01:36.677043 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd37ff46-765e-4543-ba21-e5bc7df64457" containerID="f0cd6422e11d34a5546dd497c9e24100996cb616afe6a832cd8df16274a019a9" exitCode=137 Jan 28 19:01:36 crc kubenswrapper[4749]: I0128 19:01:36.677350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd37ff46-765e-4543-ba21-e5bc7df64457","Type":"ContainerDied","Data":"f0cd6422e11d34a5546dd497c9e24100996cb616afe6a832cd8df16274a019a9"} Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.335112 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.347254 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.423518 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-config-data\") pod \"cd37ff46-765e-4543-ba21-e5bc7df64457\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.424378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-combined-ca-bundle\") pod \"cd37ff46-765e-4543-ba21-e5bc7df64457\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.424593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w8mm\" (UniqueName: \"kubernetes.io/projected/db2b8c43-d40b-4cc6-9b84-d9d43660af11-kube-api-access-2w8mm\") pod \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.424747 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-combined-ca-bundle\") pod \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.424783 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l95nv\" (UniqueName: \"kubernetes.io/projected/cd37ff46-765e-4543-ba21-e5bc7df64457-kube-api-access-l95nv\") pod \"cd37ff46-765e-4543-ba21-e5bc7df64457\" (UID: \"cd37ff46-765e-4543-ba21-e5bc7df64457\") " Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.424885 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-config-data\") pod \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\" (UID: \"db2b8c43-d40b-4cc6-9b84-d9d43660af11\") " Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.430712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2b8c43-d40b-4cc6-9b84-d9d43660af11-kube-api-access-2w8mm" (OuterVolumeSpecName: "kube-api-access-2w8mm") pod 
"db2b8c43-d40b-4cc6-9b84-d9d43660af11" (UID: "db2b8c43-d40b-4cc6-9b84-d9d43660af11"). InnerVolumeSpecName "kube-api-access-2w8mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.430813 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd37ff46-765e-4543-ba21-e5bc7df64457-kube-api-access-l95nv" (OuterVolumeSpecName: "kube-api-access-l95nv") pod "cd37ff46-765e-4543-ba21-e5bc7df64457" (UID: "cd37ff46-765e-4543-ba21-e5bc7df64457"). InnerVolumeSpecName "kube-api-access-l95nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.457083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db2b8c43-d40b-4cc6-9b84-d9d43660af11" (UID: "db2b8c43-d40b-4cc6-9b84-d9d43660af11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.457435 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-config-data" (OuterVolumeSpecName: "config-data") pod "cd37ff46-765e-4543-ba21-e5bc7df64457" (UID: "cd37ff46-765e-4543-ba21-e5bc7df64457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.462514 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-config-data" (OuterVolumeSpecName: "config-data") pod "db2b8c43-d40b-4cc6-9b84-d9d43660af11" (UID: "db2b8c43-d40b-4cc6-9b84-d9d43660af11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.467065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd37ff46-765e-4543-ba21-e5bc7df64457" (UID: "cd37ff46-765e-4543-ba21-e5bc7df64457"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.527882 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w8mm\" (UniqueName: \"kubernetes.io/projected/db2b8c43-d40b-4cc6-9b84-d9d43660af11-kube-api-access-2w8mm\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.528140 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.528200 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l95nv\" (UniqueName: \"kubernetes.io/projected/cd37ff46-765e-4543-ba21-e5bc7df64457-kube-api-access-l95nv\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.528253 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2b8c43-d40b-4cc6-9b84-d9d43660af11-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.528302 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.528796 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd37ff46-765e-4543-ba21-e5bc7df64457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.689717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db2b8c43-d40b-4cc6-9b84-d9d43660af11","Type":"ContainerDied","Data":"5ecf908db1b83cec273a8e0519fd61604f9f3723e0f10f5f35cb8f3bbb7ab2a4"} Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.689767 4749 scope.go:117] "RemoveContainer" containerID="ef7cf3b768ab072ae74ed4e2a7429c5d8bb818e1c5fc9d696b07e3b417eb0226" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.689741 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.692189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd37ff46-765e-4543-ba21-e5bc7df64457","Type":"ContainerDied","Data":"726ab1a040ac2ecf80760fea79ba93d9b7ddcf5b258a600f7ef33c4a854cd393"} Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.692263 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.743824 4749 scope.go:117] "RemoveContainer" containerID="f0cd6422e11d34a5546dd497c9e24100996cb616afe6a832cd8df16274a019a9" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.750814 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.803785 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.816358 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.829036 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:01:37 crc kubenswrapper[4749]: E0128 19:01:37.829718 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd37ff46-765e-4543-ba21-e5bc7df64457" containerName="nova-scheduler-scheduler" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.829754 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd37ff46-765e-4543-ba21-e5bc7df64457" containerName="nova-scheduler-scheduler" Jan 28 19:01:37 crc kubenswrapper[4749]: E0128 19:01:37.829784 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2b8c43-d40b-4cc6-9b84-d9d43660af11" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.829793 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2b8c43-d40b-4cc6-9b84-d9d43660af11" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.830083 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd37ff46-765e-4543-ba21-e5bc7df64457" containerName="nova-scheduler-scheduler" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.830109 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="db2b8c43-d40b-4cc6-9b84-d9d43660af11" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.831318 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.834859 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.835130 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.835301 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.840202 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.851794 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.853412 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.855490 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.869524 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.881773 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.939530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.939641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.939753 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mk4t\" (UniqueName: \"kubernetes.io/projected/95cae223-462c-417b-b24e-34fb9e61b186-kube-api-access-8mk4t\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.939905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-config-data\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.939990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.940017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j7h6\" (UniqueName: \"kubernetes.io/projected/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-kube-api-access-7j7h6\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.940041 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:37 crc kubenswrapper[4749]: I0128 19:01:37.940089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.043129 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.043234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mk4t\" (UniqueName: \"kubernetes.io/projected/95cae223-462c-417b-b24e-34fb9e61b186-kube-api-access-8mk4t\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.043298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-config-data\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.043424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.043446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j7h6\" (UniqueName: \"kubernetes.io/projected/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-kube-api-access-7j7h6\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.043471 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.043506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.043677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.048001 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.048091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.048746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.049182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.050750 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95cae223-462c-417b-b24e-34fb9e61b186-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.050758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-config-data\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.062057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mk4t\" (UniqueName: \"kubernetes.io/projected/95cae223-462c-417b-b24e-34fb9e61b186-kube-api-access-8mk4t\") pod \"nova-cell1-novncproxy-0\" (UID: \"95cae223-462c-417b-b24e-34fb9e61b186\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.062398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j7h6\" (UniqueName: \"kubernetes.io/projected/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-kube-api-access-7j7h6\") pod \"nova-scheduler-0\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " pod="openstack/nova-scheduler-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.155475 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.175029 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:01:38 crc kubenswrapper[4749]: W0128 19:01:38.667733 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95cae223_462c_417b_b24e_34fb9e61b186.slice/crio-34fd28ed97414932b94a4daaad1146e2d280e8a90446601d7bbea3e1ef054cf6 WatchSource:0}: Error finding container 34fd28ed97414932b94a4daaad1146e2d280e8a90446601d7bbea3e1ef054cf6: Status 404 returned error can't find the container with id 34fd28ed97414932b94a4daaad1146e2d280e8a90446601d7bbea3e1ef054cf6 Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.672193 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.684089 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.703950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fabdc806-55eb-4e8c-9105-dbbcc33f56ba","Type":"ContainerStarted","Data":"f40f6c2f7d51587a845fc8cc29f7562d4934cccea4c0e88d056bcbd582ba3a8e"} Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.709107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"95cae223-462c-417b-b24e-34fb9e61b186","Type":"ContainerStarted","Data":"34fd28ed97414932b94a4daaad1146e2d280e8a90446601d7bbea3e1ef054cf6"} Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.889378 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd37ff46-765e-4543-ba21-e5bc7df64457" path="/var/lib/kubelet/pods/cd37ff46-765e-4543-ba21-e5bc7df64457/volumes" Jan 28 19:01:38 crc kubenswrapper[4749]: I0128 19:01:38.890178 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2b8c43-d40b-4cc6-9b84-d9d43660af11" path="/var/lib/kubelet/pods/db2b8c43-d40b-4cc6-9b84-d9d43660af11/volumes" Jan 28 19:01:39 crc kubenswrapper[4749]: I0128 19:01:39.549689 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 19:01:39 crc kubenswrapper[4749]: I0128 19:01:39.550826 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 19:01:39 crc kubenswrapper[4749]: I0128 19:01:39.723499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fabdc806-55eb-4e8c-9105-dbbcc33f56ba","Type":"ContainerStarted","Data":"bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef"} Jan 28 19:01:39 crc kubenswrapper[4749]: I0128 19:01:39.726784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"95cae223-462c-417b-b24e-34fb9e61b186","Type":"ContainerStarted","Data":"1ab157bebdeaff160bfae3baecfb76f5b0f4f6f4e2cd594de47ae4cdbc924a40"} Jan 28 19:01:39 crc kubenswrapper[4749]: I0128 19:01:39.743277 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.743256818 podStartE2EDuration="2.743256818s" podCreationTimestamp="2026-01-28 19:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:39.74217667 +0000 UTC m=+1567.753703455" watchObservedRunningTime="2026-01-28 19:01:39.743256818 +0000 UTC m=+1567.754783603" Jan 28 19:01:39 crc 
kubenswrapper[4749]: I0128 19:01:39.785720 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.785695023 podStartE2EDuration="2.785695023s" podCreationTimestamp="2026-01-28 19:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:39.761917412 +0000 UTC m=+1567.773444197" watchObservedRunningTime="2026-01-28 19:01:39.785695023 +0000 UTC m=+1567.797221798" Jan 28 19:01:40 crc kubenswrapper[4749]: I0128 19:01:40.739226 4749 generic.go:334] "Generic (PLEG): container finished" podID="450c0e30-35b1-49d3-b11d-996b73fb11e8" containerID="532837a3a9279aba0aaf8f5c1ca2cfeb44671c0c41344be9f94f326ca727713b" exitCode=0 Jan 28 19:01:40 crc kubenswrapper[4749]: I0128 19:01:40.739319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jvjll" event={"ID":"450c0e30-35b1-49d3-b11d-996b73fb11e8","Type":"ContainerDied","Data":"532837a3a9279aba0aaf8f5c1ca2cfeb44671c0c41344be9f94f326ca727713b"} Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.183192 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.250896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-config-data\") pod \"450c0e30-35b1-49d3-b11d-996b73fb11e8\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.251026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phdjg\" (UniqueName: \"kubernetes.io/projected/450c0e30-35b1-49d3-b11d-996b73fb11e8-kube-api-access-phdjg\") pod \"450c0e30-35b1-49d3-b11d-996b73fb11e8\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.251152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-scripts\") pod \"450c0e30-35b1-49d3-b11d-996b73fb11e8\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.251625 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-combined-ca-bundle\") pod \"450c0e30-35b1-49d3-b11d-996b73fb11e8\" (UID: \"450c0e30-35b1-49d3-b11d-996b73fb11e8\") " Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.257273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-scripts" (OuterVolumeSpecName: "scripts") pod "450c0e30-35b1-49d3-b11d-996b73fb11e8" (UID: "450c0e30-35b1-49d3-b11d-996b73fb11e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.257360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/450c0e30-35b1-49d3-b11d-996b73fb11e8-kube-api-access-phdjg" (OuterVolumeSpecName: "kube-api-access-phdjg") pod "450c0e30-35b1-49d3-b11d-996b73fb11e8" (UID: "450c0e30-35b1-49d3-b11d-996b73fb11e8"). InnerVolumeSpecName "kube-api-access-phdjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.293378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "450c0e30-35b1-49d3-b11d-996b73fb11e8" (UID: "450c0e30-35b1-49d3-b11d-996b73fb11e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.296669 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-config-data" (OuterVolumeSpecName: "config-data") pod "450c0e30-35b1-49d3-b11d-996b73fb11e8" (UID: "450c0e30-35b1-49d3-b11d-996b73fb11e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.355381 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phdjg\" (UniqueName: \"kubernetes.io/projected/450c0e30-35b1-49d3-b11d-996b73fb11e8-kube-api-access-phdjg\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.355416 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.355427 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.355435 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450c0e30-35b1-49d3-b11d-996b73fb11e8-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.764147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jvjll" event={"ID":"450c0e30-35b1-49d3-b11d-996b73fb11e8","Type":"ContainerDied","Data":"d6650a2cb686bf57ce05b9a537115095e493a40502218b7a663478ca393560da"} Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.764192 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6650a2cb686bf57ce05b9a537115095e493a40502218b7a663478ca393560da" Jan 28 19:01:42 crc kubenswrapper[4749]: I0128 19:01:42.764217 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jvjll" Jan 28 19:01:43 crc kubenswrapper[4749]: I0128 19:01:43.156058 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:43 crc kubenswrapper[4749]: I0128 19:01:43.175286 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.355997 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.356424 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.450241 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 28 19:01:44 crc kubenswrapper[4749]: E0128 19:01:44.450890 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="450c0e30-35b1-49d3-b11d-996b73fb11e8" containerName="aodh-db-sync" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.450927 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="450c0e30-35b1-49d3-b11d-996b73fb11e8" containerName="aodh-db-sync" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.451440 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="450c0e30-35b1-49d3-b11d-996b73fb11e8" containerName="aodh-db-sync" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.461100 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.463848 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.464025 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-qbtm4" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.464140 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.488297 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.530182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-scripts\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.530278 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-config-data\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.530426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.530518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsbpq\" (UniqueName: 
\"kubernetes.io/projected/f786453c-9456-4fdf-99c2-cd432890a8ea-kube-api-access-xsbpq\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.549546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.549623 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.636896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsbpq\" (UniqueName: \"kubernetes.io/projected/f786453c-9456-4fdf-99c2-cd432890a8ea-kube-api-access-xsbpq\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.637355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-scripts\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.637444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-config-data\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.637657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.646547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-scripts\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.666223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-config-data\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.670544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsbpq\" (UniqueName: \"kubernetes.io/projected/f786453c-9456-4fdf-99c2-cd432890a8ea-kube-api-access-xsbpq\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.670708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " pod="openstack/aodh-0" Jan 28 19:01:44 crc kubenswrapper[4749]: I0128 19:01:44.787393 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 28 19:01:45 crc kubenswrapper[4749]: I0128 19:01:45.421481 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 19:01:45 crc kubenswrapper[4749]: I0128 19:01:45.435410 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 28 19:01:45 crc kubenswrapper[4749]: I0128 19:01:45.442531 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:45 crc kubenswrapper[4749]: I0128 19:01:45.442547 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:45 crc kubenswrapper[4749]: I0128 19:01:45.561528 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:45 crc kubenswrapper[4749]: I0128 19:01:45.561843 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:01:45 crc kubenswrapper[4749]: I0128 19:01:45.795410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerStarted","Data":"14f52031bdd931bf3218d9c24fa4dba1bd98443d224e0670437ee269cedc4088"} Jan 28 19:01:45 crc kubenswrapper[4749]: I0128 19:01:45.872497 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:01:45 crc kubenswrapper[4749]: E0128 19:01:45.873108 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.124696 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.125279 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="ceilometer-central-agent" containerID="cri-o://ffacc808180808e02ca1d42c4c565d65b86c70326a0b4615289f58b48af2341a" gracePeriod=30 Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.125459 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="proxy-httpd" 
containerID="cri-o://5c10778771e3a97f700d66e63d6ae614d1d9bd6e7bb03877e90c4a8c81d9daea" gracePeriod=30 Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.125511 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="sg-core" containerID="cri-o://68549bc3157660e557505e860c2c608e3ec7a934932f49f2696ed18bee1c69e0" gracePeriod=30 Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.125553 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="ceilometer-notification-agent" containerID="cri-o://0d83e4b723af93b84e5f442fdfc2c3dd4a046c0e4a2de26f3fbd019d87d92431" gracePeriod=30 Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.135684 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.251:3000/\": EOF" Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.752456 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.816129 4749 generic.go:334] "Generic (PLEG): container finished" podID="42c65a95-0176-43fc-a216-0bd92417240d" containerID="5c10778771e3a97f700d66e63d6ae614d1d9bd6e7bb03877e90c4a8c81d9daea" exitCode=0 Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.816167 4749 generic.go:334] "Generic (PLEG): container finished" podID="42c65a95-0176-43fc-a216-0bd92417240d" containerID="68549bc3157660e557505e860c2c608e3ec7a934932f49f2696ed18bee1c69e0" exitCode=2 Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.816176 4749 generic.go:334] "Generic (PLEG): container finished" podID="42c65a95-0176-43fc-a216-0bd92417240d" containerID="ffacc808180808e02ca1d42c4c565d65b86c70326a0b4615289f58b48af2341a" exitCode=0 Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.816218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerDied","Data":"5c10778771e3a97f700d66e63d6ae614d1d9bd6e7bb03877e90c4a8c81d9daea"} Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.816836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerDied","Data":"68549bc3157660e557505e860c2c608e3ec7a934932f49f2696ed18bee1c69e0"} Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.816932 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerDied","Data":"ffacc808180808e02ca1d42c4c565d65b86c70326a0b4615289f58b48af2341a"} Jan 28 19:01:47 crc kubenswrapper[4749]: I0128 19:01:47.817832 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerStarted","Data":"eed0f7d56885bcbc54f3390b4e518fe9bc665edeabc64ef7ab7859dbfb4fdca8"} Jan 28 19:01:48 crc kubenswrapper[4749]: I0128 19:01:48.155825 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:48 crc kubenswrapper[4749]: I0128 19:01:48.175624 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 28 19:01:48 
crc kubenswrapper[4749]: I0128 19:01:48.385951 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 28 19:01:48 crc kubenswrapper[4749]: I0128 19:01:48.403297 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:48 crc kubenswrapper[4749]: I0128 19:01:48.613259 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.251:3000/\": dial tcp 10.217.0.251:3000: connect: connection refused" Jan 28 19:01:48 crc kubenswrapper[4749]: I0128 19:01:48.845464 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 28 19:01:48 crc kubenswrapper[4749]: I0128 19:01:48.885861 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.098042 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-27pt7"] Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.099551 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.101349 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.101415 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.115114 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-27pt7"] Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.158757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-scripts\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.159143 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.159221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr7ct\" (UniqueName: \"kubernetes.io/projected/3f4890e1-bf10-412d-b8b1-00aac05c094c-kube-api-access-kr7ct\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.159352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-config-data\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: 
I0128 19:01:49.261400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-scripts\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.261481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.261561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7ct\" (UniqueName: \"kubernetes.io/projected/3f4890e1-bf10-412d-b8b1-00aac05c094c-kube-api-access-kr7ct\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.261644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-config-data\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.271853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.271902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-config-data\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.274869 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-scripts\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.288262 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr7ct\" (UniqueName: \"kubernetes.io/projected/3f4890e1-bf10-412d-b8b1-00aac05c094c-kube-api-access-kr7ct\") pod \"nova-cell1-cell-mapping-27pt7\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:49 crc kubenswrapper[4749]: I0128 19:01:49.423464 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:01:50 crc kubenswrapper[4749]: I0128 19:01:50.289001 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-27pt7"] Jan 28 19:01:50 crc kubenswrapper[4749]: I0128 19:01:50.849348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-27pt7" event={"ID":"3f4890e1-bf10-412d-b8b1-00aac05c094c","Type":"ContainerStarted","Data":"3ff77d6b76b21cdd0c372020af615789170fcbc1d4f0a3e8932c1ba65d6d57be"} Jan 28 19:01:51 crc kubenswrapper[4749]: I0128 19:01:51.868458 4749 generic.go:334] "Generic (PLEG): container finished" podID="42c65a95-0176-43fc-a216-0bd92417240d" containerID="0d83e4b723af93b84e5f442fdfc2c3dd4a046c0e4a2de26f3fbd019d87d92431" exitCode=0 Jan 28 19:01:51 crc kubenswrapper[4749]: I0128 19:01:51.868769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerDied","Data":"0d83e4b723af93b84e5f442fdfc2c3dd4a046c0e4a2de26f3fbd019d87d92431"} Jan 28 19:01:51 crc kubenswrapper[4749]: I0128 19:01:51.873106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-27pt7" event={"ID":"3f4890e1-bf10-412d-b8b1-00aac05c094c","Type":"ContainerStarted","Data":"f0e2bb50653cae00337342616ac4b2e50a7633314f11d21f518e3bdd282e5e23"} Jan 28 19:01:51 crc kubenswrapper[4749]: I0128 19:01:51.896742 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-27pt7" podStartSLOduration=2.8967229 podStartE2EDuration="2.8967229s" podCreationTimestamp="2026-01-28 19:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:51.893732866 +0000 UTC m=+1579.905259661" watchObservedRunningTime="2026-01-28 19:01:51.8967229 +0000 UTC m=+1579.908249685" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.308846 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.338790 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-config-data\") pod \"42c65a95-0176-43fc-a216-0bd92417240d\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.338864 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-run-httpd\") pod \"42c65a95-0176-43fc-a216-0bd92417240d\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.338919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-log-httpd\") pod \"42c65a95-0176-43fc-a216-0bd92417240d\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.338960 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r56l\" (UniqueName: \"kubernetes.io/projected/42c65a95-0176-43fc-a216-0bd92417240d-kube-api-access-9r56l\") pod \"42c65a95-0176-43fc-a216-0bd92417240d\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.338991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-scripts\") pod \"42c65a95-0176-43fc-a216-0bd92417240d\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.339043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-combined-ca-bundle\") pod \"42c65a95-0176-43fc-a216-0bd92417240d\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.339091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-sg-core-conf-yaml\") pod \"42c65a95-0176-43fc-a216-0bd92417240d\" (UID: \"42c65a95-0176-43fc-a216-0bd92417240d\") " Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.339247 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "42c65a95-0176-43fc-a216-0bd92417240d" (UID: "42c65a95-0176-43fc-a216-0bd92417240d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.339375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "42c65a95-0176-43fc-a216-0bd92417240d" (UID: "42c65a95-0176-43fc-a216-0bd92417240d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.339651 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.339664 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c65a95-0176-43fc-a216-0bd92417240d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.343976 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-scripts" (OuterVolumeSpecName: "scripts") pod "42c65a95-0176-43fc-a216-0bd92417240d" (UID: "42c65a95-0176-43fc-a216-0bd92417240d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.348315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c65a95-0176-43fc-a216-0bd92417240d-kube-api-access-9r56l" (OuterVolumeSpecName: "kube-api-access-9r56l") pod "42c65a95-0176-43fc-a216-0bd92417240d" (UID: "42c65a95-0176-43fc-a216-0bd92417240d"). InnerVolumeSpecName "kube-api-access-9r56l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.389139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "42c65a95-0176-43fc-a216-0bd92417240d" (UID: "42c65a95-0176-43fc-a216-0bd92417240d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.434157 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42c65a95-0176-43fc-a216-0bd92417240d" (UID: "42c65a95-0176-43fc-a216-0bd92417240d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.443481 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r56l\" (UniqueName: \"kubernetes.io/projected/42c65a95-0176-43fc-a216-0bd92417240d-kube-api-access-9r56l\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.443520 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.443547 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.443562 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.474631 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-config-data" (OuterVolumeSpecName: "config-data") pod "42c65a95-0176-43fc-a216-0bd92417240d" (UID: "42c65a95-0176-43fc-a216-0bd92417240d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.546096 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c65a95-0176-43fc-a216-0bd92417240d-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.896156 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.901207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c65a95-0176-43fc-a216-0bd92417240d","Type":"ContainerDied","Data":"1b4fe75607cf510d0001bac3ded684593e3beb30cb0a343e58f7a6064e3ba369"} Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.907011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerStarted","Data":"33b6726d0194209eb1aba47cb3dbfa3af65c51f020b2112de1cf35a5816ca589"} Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.907161 4749 scope.go:117] "RemoveContainer" containerID="5c10778771e3a97f700d66e63d6ae614d1d9bd6e7bb03877e90c4a8c81d9daea" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.943928 4749 scope.go:117] "RemoveContainer" containerID="68549bc3157660e557505e860c2c608e3ec7a934932f49f2696ed18bee1c69e0" Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.948189 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.970489 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:52 crc kubenswrapper[4749]: I0128 19:01:52.978691 4749 scope.go:117] "RemoveContainer" containerID="0d83e4b723af93b84e5f442fdfc2c3dd4a046c0e4a2de26f3fbd019d87d92431" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.015512 4749 scope.go:117] "RemoveContainer" containerID="ffacc808180808e02ca1d42c4c565d65b86c70326a0b4615289f58b48af2341a" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.023439 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:53 crc kubenswrapper[4749]: E0128 19:01:53.024480 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="sg-core" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.024504 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="sg-core" Jan 28 19:01:53 crc kubenswrapper[4749]: E0128 19:01:53.024539 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="proxy-httpd" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.024553 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="proxy-httpd" Jan 28 19:01:53 crc kubenswrapper[4749]: E0128 19:01:53.024586 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="ceilometer-notification-agent" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.024614 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="ceilometer-notification-agent" Jan 28 19:01:53 crc kubenswrapper[4749]: E0128 19:01:53.024668 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="ceilometer-central-agent" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.024678 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="ceilometer-central-agent" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.025215 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="ceilometer-central-agent" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.025253 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="proxy-httpd" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.025293 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="sg-core" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.025341 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c65a95-0176-43fc-a216-0bd92417240d" containerName="ceilometer-notification-agent" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.030168 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.040378 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.041317 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.045465 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.066867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-scripts\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.066919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l644l\" (UniqueName: \"kubernetes.io/projected/95fd1331-84d0-437e-b2d4-baf37ff37ee6-kube-api-access-l644l\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.066956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-config-data\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.066971 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.067034 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.067284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-log-httpd\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") 
" pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.067594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-run-httpd\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.169970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-scripts\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.170219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l644l\" (UniqueName: \"kubernetes.io/projected/95fd1331-84d0-437e-b2d4-baf37ff37ee6-kube-api-access-l644l\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.170364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-config-data\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.170515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.170711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.170872 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-log-httpd\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.170992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-run-httpd\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.171726 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-run-httpd\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.172167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-log-httpd\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 
28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.177132 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.177263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-scripts\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.177996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-config-data\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.178983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.186901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l644l\" (UniqueName: \"kubernetes.io/projected/95fd1331-84d0-437e-b2d4-baf37ff37ee6-kube-api-access-l644l\") pod \"ceilometer-0\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.376234 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:01:53 crc kubenswrapper[4749]: I0128 19:01:53.876994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.359271 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.359964 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.360478 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.362833 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.555107 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.560383 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.560729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.884860 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c65a95-0176-43fc-a216-0bd92417240d" path="/var/lib/kubelet/pods/42c65a95-0176-43fc-a216-0bd92417240d/volumes" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.923885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerStarted","Data":"ba67aa13db33078b1d497724e520f674f2d1dcd8f7a6b84e490566b811c821b3"} Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.924360 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.929538 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 28 19:01:54 crc kubenswrapper[4749]: I0128 19:01:54.929949 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.148814 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs"] Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.152278 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.168158 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs"] Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.266167 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh2cg\" (UniqueName: \"kubernetes.io/projected/5aeab955-536d-4525-adac-04c683c92aeb-kube-api-access-rh2cg\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.266378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.266517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.266601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.266657 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.279529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.382498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh2cg\" (UniqueName: \"kubernetes.io/projected/5aeab955-536d-4525-adac-04c683c92aeb-kube-api-access-rh2cg\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.382594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.382647 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.382683 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.382709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.382793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.383815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.384207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.384345 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.384344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.388014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aeab955-536d-4525-adac-04c683c92aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.404588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh2cg\" (UniqueName: 
\"kubernetes.io/projected/5aeab955-536d-4525-adac-04c683c92aeb-kube-api-access-rh2cg\") pod \"dnsmasq-dns-6b7bbf7cf9-hx6rs\" (UID: \"5aeab955-536d-4525-adac-04c683c92aeb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.515874 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.939294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerStarted","Data":"b9f4b75373b095318b63dce94373d54106c3a65c93fe431051563fbf8544ef7e"} Jan 28 19:01:55 crc kubenswrapper[4749]: I0128 19:01:55.950442 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerStarted","Data":"b8476bb192297a5845725a9580be68f5c3d7fca1724c9a5daa24e6e43971e741"} Jan 28 19:01:56 crc kubenswrapper[4749]: W0128 19:01:56.051580 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aeab955_536d_4525_adac_04c683c92aeb.slice/crio-27a12cdfa9d60463878b280a3a6a9d1b0bea79a240a11d666581ae01823eb555 WatchSource:0}: Error finding container 27a12cdfa9d60463878b280a3a6a9d1b0bea79a240a11d666581ae01823eb555: Status 404 returned error can't find the container with id 27a12cdfa9d60463878b280a3a6a9d1b0bea79a240a11d666581ae01823eb555 Jan 28 19:01:56 crc kubenswrapper[4749]: I0128 19:01:56.054145 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs"] Jan 28 19:01:56 crc kubenswrapper[4749]: I0128 19:01:56.964741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerStarted","Data":"a6b9a16eaa978319662ccc045a7e513eea6f0990824769eb498a72e3862e7ba8"} Jan 28 19:01:56 crc kubenswrapper[4749]: I0128 19:01:56.973209 4749 generic.go:334] "Generic (PLEG): container finished" podID="5aeab955-536d-4525-adac-04c683c92aeb" containerID="61920b7e49625990937427589b857b7083425e5d3e7cc3099e4b909d1214a002" exitCode=0 Jan 28 19:01:56 crc kubenswrapper[4749]: I0128 19:01:56.973315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" event={"ID":"5aeab955-536d-4525-adac-04c683c92aeb","Type":"ContainerDied","Data":"61920b7e49625990937427589b857b7083425e5d3e7cc3099e4b909d1214a002"} Jan 28 19:01:56 crc kubenswrapper[4749]: I0128 19:01:56.973372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" event={"ID":"5aeab955-536d-4525-adac-04c683c92aeb","Type":"ContainerStarted","Data":"27a12cdfa9d60463878b280a3a6a9d1b0bea79a240a11d666581ae01823eb555"} Jan 28 19:01:57 crc kubenswrapper[4749]: I0128 19:01:57.988314 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:01:57 crc kubenswrapper[4749]: I0128 19:01:57.992175 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-log" containerID="cri-o://1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1" gracePeriod=30 Jan 28 19:01:57 crc kubenswrapper[4749]: I0128 19:01:57.992301 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-api" containerID="cri-o://6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970" gracePeriod=30 Jan 28 19:01:57 crc kubenswrapper[4749]: I0128 19:01:57.992437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerStarted","Data":"8963a072c3d47596604befd0560d6377c1152bbc1fe156079f414cd039a2bb38"} Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.006812 4749 generic.go:334] "Generic (PLEG): container finished" podID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerID="1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1" exitCode=143 Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.006892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"418dea88-d8b7-43db-8760-6b055d9f6b04","Type":"ContainerDied","Data":"1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1"} Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.010154 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerStarted","Data":"84ba5cd8bdc4ee0ca66aa7f29dd240c9df5caa33635d0694f4b9cd7349bdb1f3"} Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.010363 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-api" containerID="cri-o://eed0f7d56885bcbc54f3390b4e518fe9bc665edeabc64ef7ab7859dbfb4fdca8" gracePeriod=30 Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.010973 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-listener" containerID="cri-o://84ba5cd8bdc4ee0ca66aa7f29dd240c9df5caa33635d0694f4b9cd7349bdb1f3" gracePeriod=30 Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.011037 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-notifier" containerID="cri-o://b8476bb192297a5845725a9580be68f5c3d7fca1724c9a5daa24e6e43971e741" gracePeriod=30 Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.011076 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-evaluator" containerID="cri-o://33b6726d0194209eb1aba47cb3dbfa3af65c51f020b2112de1cf35a5816ca589" gracePeriod=30 Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.014379 4749 generic.go:334] "Generic (PLEG): container finished" podID="3f4890e1-bf10-412d-b8b1-00aac05c094c" containerID="f0e2bb50653cae00337342616ac4b2e50a7633314f11d21f518e3bdd282e5e23" exitCode=0 Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.014450 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-27pt7" event={"ID":"3f4890e1-bf10-412d-b8b1-00aac05c094c","Type":"ContainerDied","Data":"f0e2bb50653cae00337342616ac4b2e50a7633314f11d21f518e3bdd282e5e23"} Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.021111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" 
event={"ID":"5aeab955-536d-4525-adac-04c683c92aeb","Type":"ContainerStarted","Data":"cf84d2e88b603d805bb7a82e695cbc1b97b4f8461497de3c8710df7b0b7e1248"} Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.022144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.074289 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.130101 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.363350299 podStartE2EDuration="15.130076422s" podCreationTimestamp="2026-01-28 19:01:44 +0000 UTC" firstStartedPulling="2026-01-28 19:01:45.421186215 +0000 UTC m=+1573.432712990" lastFinishedPulling="2026-01-28 19:01:58.187912338 +0000 UTC m=+1586.199439113" observedRunningTime="2026-01-28 19:01:59.039015977 +0000 UTC m=+1587.050542762" watchObservedRunningTime="2026-01-28 19:01:59.130076422 +0000 UTC m=+1587.141603197" Jan 28 19:01:59 crc kubenswrapper[4749]: I0128 19:01:59.157578 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" podStartSLOduration=4.157559627 podStartE2EDuration="4.157559627s" podCreationTimestamp="2026-01-28 19:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:01:59.115636513 +0000 UTC m=+1587.127163308" watchObservedRunningTime="2026-01-28 19:01:59.157559627 +0000 UTC m=+1587.169086402" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.034019 4749 generic.go:334] "Generic (PLEG): container finished" podID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerID="33b6726d0194209eb1aba47cb3dbfa3af65c51f020b2112de1cf35a5816ca589" exitCode=0 Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.034050 4749 generic.go:334] "Generic (PLEG): container finished" podID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerID="eed0f7d56885bcbc54f3390b4e518fe9bc665edeabc64ef7ab7859dbfb4fdca8" exitCode=0 Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.034097 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerDied","Data":"33b6726d0194209eb1aba47cb3dbfa3af65c51f020b2112de1cf35a5816ca589"} Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.034152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerDied","Data":"eed0f7d56885bcbc54f3390b4e518fe9bc665edeabc64ef7ab7859dbfb4fdca8"} Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.599757 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.742280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-config-data\") pod \"3f4890e1-bf10-412d-b8b1-00aac05c094c\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.742661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-combined-ca-bundle\") pod \"3f4890e1-bf10-412d-b8b1-00aac05c094c\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.742768 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr7ct\" (UniqueName: \"kubernetes.io/projected/3f4890e1-bf10-412d-b8b1-00aac05c094c-kube-api-access-kr7ct\") pod \"3f4890e1-bf10-412d-b8b1-00aac05c094c\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.742883 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-scripts\") pod \"3f4890e1-bf10-412d-b8b1-00aac05c094c\" (UID: \"3f4890e1-bf10-412d-b8b1-00aac05c094c\") " Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.754395 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-scripts" (OuterVolumeSpecName: "scripts") pod "3f4890e1-bf10-412d-b8b1-00aac05c094c" (UID: "3f4890e1-bf10-412d-b8b1-00aac05c094c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.755039 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4890e1-bf10-412d-b8b1-00aac05c094c-kube-api-access-kr7ct" (OuterVolumeSpecName: "kube-api-access-kr7ct") pod "3f4890e1-bf10-412d-b8b1-00aac05c094c" (UID: "3f4890e1-bf10-412d-b8b1-00aac05c094c"). InnerVolumeSpecName "kube-api-access-kr7ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.779087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-config-data" (OuterVolumeSpecName: "config-data") pod "3f4890e1-bf10-412d-b8b1-00aac05c094c" (UID: "3f4890e1-bf10-412d-b8b1-00aac05c094c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.792516 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f4890e1-bf10-412d-b8b1-00aac05c094c" (UID: "3f4890e1-bf10-412d-b8b1-00aac05c094c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.847021 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr7ct\" (UniqueName: \"kubernetes.io/projected/3f4890e1-bf10-412d-b8b1-00aac05c094c-kube-api-access-kr7ct\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.847051 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.847064 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.847082 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4890e1-bf10-412d-b8b1-00aac05c094c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:00 crc kubenswrapper[4749]: I0128 19:02:00.872577 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:02:00 crc kubenswrapper[4749]: E0128 19:02:00.872863 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.046110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerStarted","Data":"578e79cb7c64f8a4c32c1206751c9be1ce184b4594111f1ac6e3183ede95ecba"} Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.047171 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.046462 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="proxy-httpd" containerID="cri-o://578e79cb7c64f8a4c32c1206751c9be1ce184b4594111f1ac6e3183ede95ecba" gracePeriod=30 Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.046476 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="sg-core" containerID="cri-o://8963a072c3d47596604befd0560d6377c1152bbc1fe156079f414cd039a2bb38" gracePeriod=30 Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.046487 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="ceilometer-notification-agent" containerID="cri-o://a6b9a16eaa978319662ccc045a7e513eea6f0990824769eb498a72e3862e7ba8" gracePeriod=30 Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.046235 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="ceilometer-central-agent" 
containerID="cri-o://b9f4b75373b095318b63dce94373d54106c3a65c93fe431051563fbf8544ef7e" gracePeriod=30 Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.050854 4749 generic.go:334] "Generic (PLEG): container finished" podID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerID="b8476bb192297a5845725a9580be68f5c3d7fca1724c9a5daa24e6e43971e741" exitCode=0 Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.050889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerDied","Data":"b8476bb192297a5845725a9580be68f5c3d7fca1724c9a5daa24e6e43971e741"} Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.053884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-27pt7" event={"ID":"3f4890e1-bf10-412d-b8b1-00aac05c094c","Type":"ContainerDied","Data":"3ff77d6b76b21cdd0c372020af615789170fcbc1d4f0a3e8932c1ba65d6d57be"} Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.053913 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ff77d6b76b21cdd0c372020af615789170fcbc1d4f0a3e8932c1ba65d6d57be" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.053944 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-27pt7" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.078166 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.007914609 podStartE2EDuration="9.078147408s" podCreationTimestamp="2026-01-28 19:01:52 +0000 UTC" firstStartedPulling="2026-01-28 19:01:54.118309502 +0000 UTC m=+1582.129836287" lastFinishedPulling="2026-01-28 19:02:00.188542321 +0000 UTC m=+1588.200069086" observedRunningTime="2026-01-28 19:02:01.065843012 +0000 UTC m=+1589.077369787" watchObservedRunningTime="2026-01-28 19:02:01.078147408 +0000 UTC m=+1589.089674183" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.272213 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.272562 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fabdc806-55eb-4e8c-9105-dbbcc33f56ba" containerName="nova-scheduler-scheduler" containerID="cri-o://bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef" gracePeriod=30 Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.293808 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.294084 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-log" containerID="cri-o://639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07" gracePeriod=30 Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.294224 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-metadata" containerID="cri-o://f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34" gracePeriod=30 Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.822298 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.871399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txbmk\" (UniqueName: \"kubernetes.io/projected/418dea88-d8b7-43db-8760-6b055d9f6b04-kube-api-access-txbmk\") pod \"418dea88-d8b7-43db-8760-6b055d9f6b04\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.871511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-combined-ca-bundle\") pod \"418dea88-d8b7-43db-8760-6b055d9f6b04\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.871593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/418dea88-d8b7-43db-8760-6b055d9f6b04-logs\") pod \"418dea88-d8b7-43db-8760-6b055d9f6b04\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.871731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-config-data\") pod \"418dea88-d8b7-43db-8760-6b055d9f6b04\" (UID: \"418dea88-d8b7-43db-8760-6b055d9f6b04\") " Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.873010 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/418dea88-d8b7-43db-8760-6b055d9f6b04-logs" (OuterVolumeSpecName: "logs") pod "418dea88-d8b7-43db-8760-6b055d9f6b04" (UID: "418dea88-d8b7-43db-8760-6b055d9f6b04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.877024 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418dea88-d8b7-43db-8760-6b055d9f6b04-kube-api-access-txbmk" (OuterVolumeSpecName: "kube-api-access-txbmk") pod "418dea88-d8b7-43db-8760-6b055d9f6b04" (UID: "418dea88-d8b7-43db-8760-6b055d9f6b04"). InnerVolumeSpecName "kube-api-access-txbmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.924079 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "418dea88-d8b7-43db-8760-6b055d9f6b04" (UID: "418dea88-d8b7-43db-8760-6b055d9f6b04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.932044 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-config-data" (OuterVolumeSpecName: "config-data") pod "418dea88-d8b7-43db-8760-6b055d9f6b04" (UID: "418dea88-d8b7-43db-8760-6b055d9f6b04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.974414 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.974460 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/418dea88-d8b7-43db-8760-6b055d9f6b04-logs\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.974474 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/418dea88-d8b7-43db-8760-6b055d9f6b04-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:01 crc kubenswrapper[4749]: I0128 19:02:01.974487 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txbmk\" (UniqueName: \"kubernetes.io/projected/418dea88-d8b7-43db-8760-6b055d9f6b04-kube-api-access-txbmk\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.066983 4749 generic.go:334] "Generic (PLEG): container finished" podID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerID="578e79cb7c64f8a4c32c1206751c9be1ce184b4594111f1ac6e3183ede95ecba" exitCode=0 Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.067019 4749 generic.go:334] "Generic (PLEG): container finished" podID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerID="8963a072c3d47596604befd0560d6377c1152bbc1fe156079f414cd039a2bb38" exitCode=2 Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.067026 4749 generic.go:334] "Generic (PLEG): container finished" podID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerID="a6b9a16eaa978319662ccc045a7e513eea6f0990824769eb498a72e3862e7ba8" exitCode=0 Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.067048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerDied","Data":"578e79cb7c64f8a4c32c1206751c9be1ce184b4594111f1ac6e3183ede95ecba"} Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.067092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerDied","Data":"8963a072c3d47596604befd0560d6377c1152bbc1fe156079f414cd039a2bb38"} Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.067104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerDied","Data":"a6b9a16eaa978319662ccc045a7e513eea6f0990824769eb498a72e3862e7ba8"} Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.069661 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerID="639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07" exitCode=143 Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.069733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d411e3b-b35c-4f9b-8463-4b18218c6693","Type":"ContainerDied","Data":"639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07"} Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.071449 4749 generic.go:334] "Generic (PLEG): container finished" podID="418dea88-d8b7-43db-8760-6b055d9f6b04" 
containerID="6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970" exitCode=0 Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.071480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"418dea88-d8b7-43db-8760-6b055d9f6b04","Type":"ContainerDied","Data":"6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970"} Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.071497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"418dea88-d8b7-43db-8760-6b055d9f6b04","Type":"ContainerDied","Data":"30735ad05c23c0e5a5c8e5663f37b8f977fd678d81c619bf6173c65bdde2d411"} Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.071514 4749 scope.go:117] "RemoveContainer" containerID="6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.071680 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.103597 4749 scope.go:117] "RemoveContainer" containerID="1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.110657 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.121685 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.127736 4749 scope.go:117] "RemoveContainer" containerID="6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970" Jan 28 19:02:02 crc kubenswrapper[4749]: E0128 19:02:02.128449 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970\": container with ID starting with 6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970 not found: ID does not exist" containerID="6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.128566 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970"} err="failed to get container status \"6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970\": rpc error: code = NotFound desc = could not find container \"6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970\": container with ID starting with 6722efa60d06043b945b5f442cac6ad795368c505c7ddab5a539ec03fc654970 not found: ID does not exist" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.128638 4749 scope.go:117] "RemoveContainer" containerID="1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1" Jan 28 19:02:02 crc kubenswrapper[4749]: E0128 19:02:02.129056 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1\": container with ID starting with 1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1 not found: ID does not exist" containerID="1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.129105 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1"} err="failed to get container status \"1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1\": rpc error: code = NotFound desc = could not find container \"1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1\": container with ID starting with 1eaf5b02b06fbd0bdc197c1b6340508278671b23e4ee7f5fcfd51b038abb55d1 not found: ID does not exist" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.133261 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 19:02:02 crc kubenswrapper[4749]: E0128 19:02:02.133902 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-log" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.134291 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-log" Jan 28 19:02:02 crc kubenswrapper[4749]: E0128 19:02:02.134380 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-api" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.134434 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-api" Jan 28 19:02:02 crc kubenswrapper[4749]: E0128 19:02:02.134495 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4890e1-bf10-412d-b8b1-00aac05c094c" containerName="nova-manage" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.134552 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4890e1-bf10-412d-b8b1-00aac05c094c" containerName="nova-manage" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.134827 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4890e1-bf10-412d-b8b1-00aac05c094c" containerName="nova-manage" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.134893 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-api" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.134962 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" containerName="nova-api-log" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.136185 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.144804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.173683 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.174053 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.174189 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.179659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxxbw\" (UniqueName: \"kubernetes.io/projected/31f23a26-7126-4b08-9616-49b0541bff3c-kube-api-access-mxxbw\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.179740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-config-data\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.179929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-public-tls-certs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.180015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.180067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.180172 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31f23a26-7126-4b08-9616-49b0541bff3c-logs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.282426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxxbw\" (UniqueName: \"kubernetes.io/projected/31f23a26-7126-4b08-9616-49b0541bff3c-kube-api-access-mxxbw\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.282500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-config-data\") pod 
\"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.282570 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-public-tls-certs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.282597 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.282617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.282666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31f23a26-7126-4b08-9616-49b0541bff3c-logs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.283311 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31f23a26-7126-4b08-9616-49b0541bff3c-logs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.286421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-config-data\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.286481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.286717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.287149 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f23a26-7126-4b08-9616-49b0541bff3c-public-tls-certs\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.301004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxxbw\" (UniqueName: \"kubernetes.io/projected/31f23a26-7126-4b08-9616-49b0541bff3c-kube-api-access-mxxbw\") pod \"nova-api-0\" (UID: \"31f23a26-7126-4b08-9616-49b0541bff3c\") " 
pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.491499 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 19:02:02 crc kubenswrapper[4749]: I0128 19:02:02.893435 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418dea88-d8b7-43db-8760-6b055d9f6b04" path="/var/lib/kubelet/pods/418dea88-d8b7-43db-8760-6b055d9f6b04/volumes" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.015660 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.100213 4749 generic.go:334] "Generic (PLEG): container finished" podID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerID="b9f4b75373b095318b63dce94373d54106c3a65c93fe431051563fbf8544ef7e" exitCode=0 Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.100369 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerDied","Data":"b9f4b75373b095318b63dce94373d54106c3a65c93fe431051563fbf8544ef7e"} Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.102093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31f23a26-7126-4b08-9616-49b0541bff3c","Type":"ContainerStarted","Data":"33a8f1e25e1bea6a279a96db8bf6215be5f71373187b3f223698140e27ea4878"} Jan 28 19:02:03 crc kubenswrapper[4749]: E0128 19:02:03.180835 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 19:02:03 crc kubenswrapper[4749]: E0128 19:02:03.184635 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 19:02:03 crc kubenswrapper[4749]: E0128 19:02:03.187911 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 19:02:03 crc kubenswrapper[4749]: E0128 19:02:03.187963 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fabdc806-55eb-4e8c-9105-dbbcc33f56ba" containerName="nova-scheduler-scheduler" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.198235 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.225420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-config-data\") pod \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.225530 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l644l\" (UniqueName: \"kubernetes.io/projected/95fd1331-84d0-437e-b2d4-baf37ff37ee6-kube-api-access-l644l\") pod \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.225556 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-sg-core-conf-yaml\") pod \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.225583 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-combined-ca-bundle\") pod \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.225741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-log-httpd\") pod \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.225940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-run-httpd\") pod \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.225995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-scripts\") pod \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\" (UID: \"95fd1331-84d0-437e-b2d4-baf37ff37ee6\") " Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.238149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95fd1331-84d0-437e-b2d4-baf37ff37ee6" (UID: "95fd1331-84d0-437e-b2d4-baf37ff37ee6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.239171 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95fd1331-84d0-437e-b2d4-baf37ff37ee6" (UID: "95fd1331-84d0-437e-b2d4-baf37ff37ee6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.272882 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-scripts" (OuterVolumeSpecName: "scripts") pod "95fd1331-84d0-437e-b2d4-baf37ff37ee6" (UID: "95fd1331-84d0-437e-b2d4-baf37ff37ee6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.273088 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fd1331-84d0-437e-b2d4-baf37ff37ee6-kube-api-access-l644l" (OuterVolumeSpecName: "kube-api-access-l644l") pod "95fd1331-84d0-437e-b2d4-baf37ff37ee6" (UID: "95fd1331-84d0-437e-b2d4-baf37ff37ee6"). InnerVolumeSpecName "kube-api-access-l644l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.277204 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95fd1331-84d0-437e-b2d4-baf37ff37ee6" (UID: "95fd1331-84d0-437e-b2d4-baf37ff37ee6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.332002 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.332042 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fd1331-84d0-437e-b2d4-baf37ff37ee6-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.332051 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.332060 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l644l\" (UniqueName: \"kubernetes.io/projected/95fd1331-84d0-437e-b2d4-baf37ff37ee6-kube-api-access-l644l\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.332071 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.364374 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95fd1331-84d0-437e-b2d4-baf37ff37ee6" (UID: "95fd1331-84d0-437e-b2d4-baf37ff37ee6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.398557 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-config-data" (OuterVolumeSpecName: "config-data") pod "95fd1331-84d0-437e-b2d4-baf37ff37ee6" (UID: "95fd1331-84d0-437e-b2d4-baf37ff37ee6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.434490 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:03 crc kubenswrapper[4749]: I0128 19:02:03.434526 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fd1331-84d0-437e-b2d4-baf37ff37ee6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.114125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31f23a26-7126-4b08-9616-49b0541bff3c","Type":"ContainerStarted","Data":"4ae73fc417f011c080df2f5b7204dbf158ac627fe047ae5f93194a8e33258047"} Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.114610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31f23a26-7126-4b08-9616-49b0541bff3c","Type":"ContainerStarted","Data":"3120172bd92dd77644f48673a6f1c1a977a425ea801824f911aca7466abc5025"} Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.116740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fd1331-84d0-437e-b2d4-baf37ff37ee6","Type":"ContainerDied","Data":"ba67aa13db33078b1d497724e520f674f2d1dcd8f7a6b84e490566b811c821b3"} Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.116785 4749 scope.go:117] "RemoveContainer" containerID="578e79cb7c64f8a4c32c1206751c9be1ce184b4594111f1ac6e3183ede95ecba" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.116822 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.137558 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.137539247 podStartE2EDuration="2.137539247s" podCreationTimestamp="2026-01-28 19:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:02:04.130765688 +0000 UTC m=+1592.142292483" watchObservedRunningTime="2026-01-28 19:02:04.137539247 +0000 UTC m=+1592.149066032" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.156414 4749 scope.go:117] "RemoveContainer" containerID="8963a072c3d47596604befd0560d6377c1152bbc1fe156079f414cd039a2bb38" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.192665 4749 scope.go:117] "RemoveContainer" containerID="a6b9a16eaa978319662ccc045a7e513eea6f0990824769eb498a72e3862e7ba8" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.195890 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.218467 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.218885 4749 scope.go:117] "RemoveContainer" containerID="b9f4b75373b095318b63dce94373d54106c3a65c93fe431051563fbf8544ef7e" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.232187 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:04 crc kubenswrapper[4749]: E0128 19:02:04.232922 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" 
containerName="proxy-httpd" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.232942 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="proxy-httpd" Jan 28 19:02:04 crc kubenswrapper[4749]: E0128 19:02:04.232965 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="ceilometer-notification-agent" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.232975 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="ceilometer-notification-agent" Jan 28 19:02:04 crc kubenswrapper[4749]: E0128 19:02:04.232989 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="ceilometer-central-agent" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.232999 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="ceilometer-central-agent" Jan 28 19:02:04 crc kubenswrapper[4749]: E0128 19:02:04.233038 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="sg-core" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.233047 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="sg-core" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.233348 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="ceilometer-notification-agent" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.233374 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="sg-core" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.233393 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="proxy-httpd" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.233416 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" containerName="ceilometer-central-agent" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.238442 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.241542 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.241743 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.255873 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.274089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.274152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-log-httpd\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.274178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-run-httpd\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.274226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.274328 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-config-data\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.274432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjb28\" (UniqueName: \"kubernetes.io/projected/e4a6a301-409d-4642-a4b6-c15b2fc03c84-kube-api-access-fjb28\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.274454 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-scripts\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.376121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-log-httpd\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.376169 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-run-httpd\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.376223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.376292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-config-data\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.376419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjb28\" (UniqueName: \"kubernetes.io/projected/e4a6a301-409d-4642-a4b6-c15b2fc03c84-kube-api-access-fjb28\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.376443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-scripts\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.376474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.398176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-run-httpd\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.398547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-log-httpd\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.401636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.403217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-scripts\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.404845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.420277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjb28\" (UniqueName: \"kubernetes.io/projected/e4a6a301-409d-4642-a4b6-c15b2fc03c84-kube-api-access-fjb28\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.436187 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-config-data\") pod \"ceilometer-0\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.550001 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": dial tcp 10.217.0.254:8775: connect: connection refused" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.550048 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": dial tcp 10.217.0.254:8775: connect: connection refused" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.571625 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.893523 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fd1331-84d0-437e-b2d4-baf37ff37ee6" path="/var/lib/kubelet/pods/95fd1331-84d0-437e-b2d4-baf37ff37ee6/volumes" Jan 28 19:02:04 crc kubenswrapper[4749]: I0128 19:02:04.995180 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.109054 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-config-data\") pod \"9d411e3b-b35c-4f9b-8463-4b18218c6693\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.109110 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.109154 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-combined-ca-bundle\") pod \"9d411e3b-b35c-4f9b-8463-4b18218c6693\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.109284 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d411e3b-b35c-4f9b-8463-4b18218c6693-logs\") pod \"9d411e3b-b35c-4f9b-8463-4b18218c6693\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.109327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-nova-metadata-tls-certs\") pod \"9d411e3b-b35c-4f9b-8463-4b18218c6693\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.109370 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg794\" (UniqueName: \"kubernetes.io/projected/9d411e3b-b35c-4f9b-8463-4b18218c6693-kube-api-access-sg794\") pod \"9d411e3b-b35c-4f9b-8463-4b18218c6693\" (UID: \"9d411e3b-b35c-4f9b-8463-4b18218c6693\") " Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.111149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d411e3b-b35c-4f9b-8463-4b18218c6693-logs" (OuterVolumeSpecName: "logs") pod "9d411e3b-b35c-4f9b-8463-4b18218c6693" (UID: "9d411e3b-b35c-4f9b-8463-4b18218c6693"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.115887 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d411e3b-b35c-4f9b-8463-4b18218c6693-kube-api-access-sg794" (OuterVolumeSpecName: "kube-api-access-sg794") pod "9d411e3b-b35c-4f9b-8463-4b18218c6693" (UID: "9d411e3b-b35c-4f9b-8463-4b18218c6693"). InnerVolumeSpecName "kube-api-access-sg794". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.130631 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerStarted","Data":"1844a219bf8f001899baaf4503ba8ecfd9128d2d7dad790331a4dc0dad7b0fbd"} Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.133507 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerID="f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34" exitCode=0 Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.133547 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.133610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d411e3b-b35c-4f9b-8463-4b18218c6693","Type":"ContainerDied","Data":"f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34"} Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.133703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d411e3b-b35c-4f9b-8463-4b18218c6693","Type":"ContainerDied","Data":"bd55eced5d3325a9e6bf16e3aa31de225a8fbeeba783594651957556ab7a9747"} Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.133727 4749 scope.go:117] "RemoveContainer" containerID="f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.149996 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d411e3b-b35c-4f9b-8463-4b18218c6693" (UID: "9d411e3b-b35c-4f9b-8463-4b18218c6693"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.154943 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-config-data" (OuterVolumeSpecName: "config-data") pod "9d411e3b-b35c-4f9b-8463-4b18218c6693" (UID: "9d411e3b-b35c-4f9b-8463-4b18218c6693"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.171820 4749 scope.go:117] "RemoveContainer" containerID="639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.191991 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9d411e3b-b35c-4f9b-8463-4b18218c6693" (UID: "9d411e3b-b35c-4f9b-8463-4b18218c6693"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.192269 4749 scope.go:117] "RemoveContainer" containerID="f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34" Jan 28 19:02:05 crc kubenswrapper[4749]: E0128 19:02:05.192689 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34\": container with ID starting with f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34 not found: ID does not exist" containerID="f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.192734 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34"} err="failed to get container status \"f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34\": rpc error: code = NotFound desc = could not find container \"f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34\": container with ID starting with f57c02ec068c88063b4444e9d99a4502a4059f075ece7d0bc089cb351f2ddf34 not found: ID does not exist" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.192767 4749 scope.go:117] "RemoveContainer" containerID="639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07" Jan 28 19:02:05 crc kubenswrapper[4749]: E0128 19:02:05.193088 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07\": container with ID starting with 639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07 not found: ID does not exist" containerID="639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.193113 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07"} err="failed to get container status \"639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07\": rpc error: code = NotFound desc = could not find container \"639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07\": container with ID starting with 639a14de926eead7fbcb98c8fd78eb2eb94957405b0fdbb8940a90efcae99b07 not found: ID does not exist" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.212502 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.212531 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.212541 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d411e3b-b35c-4f9b-8463-4b18218c6693-logs\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.212549 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9d411e3b-b35c-4f9b-8463-4b18218c6693-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.212558 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg794\" (UniqueName: \"kubernetes.io/projected/9d411e3b-b35c-4f9b-8463-4b18218c6693-kube-api-access-sg794\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.477458 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.492797 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.507620 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:02:05 crc kubenswrapper[4749]: E0128 19:02:05.508095 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-log" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.508112 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-log" Jan 28 19:02:05 crc kubenswrapper[4749]: E0128 19:02:05.508149 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-metadata" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.508156 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-metadata" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.508428 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-log" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.508445 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" containerName="nova-metadata-metadata" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.509711 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.517546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hx6rs" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.518456 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.520580 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.534268 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.620992 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mc9h\" (UniqueName: \"kubernetes.io/projected/8f9fb94c-26de-427c-bd82-919875c80787-kube-api-access-4mc9h\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.621140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.621243 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-config-data\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.621277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.621421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9fb94c-26de-427c-bd82-919875c80787-logs\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.694902 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-mf4jk"] Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.695933 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" podUID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" containerName="dnsmasq-dns" containerID="cri-o://a9398e22e233fff5ed277aec93ec36e8fd610c432a1ce0b6b86871b0aca3f3d1" gracePeriod=10 Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.723875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mc9h\" (UniqueName: \"kubernetes.io/projected/8f9fb94c-26de-427c-bd82-919875c80787-kube-api-access-4mc9h\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 
28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.724194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.724495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-config-data\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.724614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.724837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9fb94c-26de-427c-bd82-919875c80787-logs\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.725966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9fb94c-26de-427c-bd82-919875c80787-logs\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.736709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-config-data\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.737154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.737939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9fb94c-26de-427c-bd82-919875c80787-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.757017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mc9h\" (UniqueName: \"kubernetes.io/projected/8f9fb94c-26de-427c-bd82-919875c80787-kube-api-access-4mc9h\") pod \"nova-metadata-0\" (UID: \"8f9fb94c-26de-427c-bd82-919875c80787\") " pod="openstack/nova-metadata-0" Jan 28 19:02:05 crc kubenswrapper[4749]: I0128 19:02:05.891153 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.170721 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" containerID="a9398e22e233fff5ed277aec93ec36e8fd610c432a1ce0b6b86871b0aca3f3d1" exitCode=0 Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.171007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" event={"ID":"9ec54373-337d-4d3f-a3ae-1b5be5f892b4","Type":"ContainerDied","Data":"a9398e22e233fff5ed277aec93ec36e8fd610c432a1ce0b6b86871b0aca3f3d1"} Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.173189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerStarted","Data":"2a1cb1a67ce4979b089a5974362ea41e1f90517846d058aeaff95230cf2787e9"} Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.237885 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.347159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-svc\") pod \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.347404 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld62j\" (UniqueName: \"kubernetes.io/projected/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-kube-api-access-ld62j\") pod \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.347436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-config\") pod \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.347499 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-nb\") pod \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.347559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-sb\") pod \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.347614 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-swift-storage-0\") pod \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\" (UID: \"9ec54373-337d-4d3f-a3ae-1b5be5f892b4\") " Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.354200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-kube-api-access-ld62j" (OuterVolumeSpecName: "kube-api-access-ld62j") pod "9ec54373-337d-4d3f-a3ae-1b5be5f892b4" (UID: 
"9ec54373-337d-4d3f-a3ae-1b5be5f892b4"). InnerVolumeSpecName "kube-api-access-ld62j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.424955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ec54373-337d-4d3f-a3ae-1b5be5f892b4" (UID: "9ec54373-337d-4d3f-a3ae-1b5be5f892b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.448931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-config" (OuterVolumeSpecName: "config") pod "9ec54373-337d-4d3f-a3ae-1b5be5f892b4" (UID: "9ec54373-337d-4d3f-a3ae-1b5be5f892b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.450580 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.450601 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld62j\" (UniqueName: \"kubernetes.io/projected/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-kube-api-access-ld62j\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.450612 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-config\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.451692 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ec54373-337d-4d3f-a3ae-1b5be5f892b4" (UID: "9ec54373-337d-4d3f-a3ae-1b5be5f892b4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.459516 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ec54373-337d-4d3f-a3ae-1b5be5f892b4" (UID: "9ec54373-337d-4d3f-a3ae-1b5be5f892b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.467277 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ec54373-337d-4d3f-a3ae-1b5be5f892b4" (UID: "9ec54373-337d-4d3f-a3ae-1b5be5f892b4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.549242 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.552660 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.553622 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.553686 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ec54373-337d-4d3f-a3ae-1b5be5f892b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.895378 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d411e3b-b35c-4f9b-8463-4b18218c6693" path="/var/lib/kubelet/pods/9d411e3b-b35c-4f9b-8463-4b18218c6693/volumes" Jan 28 19:02:06 crc kubenswrapper[4749]: I0128 19:02:06.961157 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.068821 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-combined-ca-bundle\") pod \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.069208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-config-data\") pod \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.069758 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j7h6\" (UniqueName: \"kubernetes.io/projected/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-kube-api-access-7j7h6\") pod \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\" (UID: \"fabdc806-55eb-4e8c-9105-dbbcc33f56ba\") " Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.079631 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-kube-api-access-7j7h6" (OuterVolumeSpecName: "kube-api-access-7j7h6") pod "fabdc806-55eb-4e8c-9105-dbbcc33f56ba" (UID: "fabdc806-55eb-4e8c-9105-dbbcc33f56ba"). InnerVolumeSpecName "kube-api-access-7j7h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.112458 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fabdc806-55eb-4e8c-9105-dbbcc33f56ba" (UID: "fabdc806-55eb-4e8c-9105-dbbcc33f56ba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.117230 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-config-data" (OuterVolumeSpecName: "config-data") pod "fabdc806-55eb-4e8c-9105-dbbcc33f56ba" (UID: "fabdc806-55eb-4e8c-9105-dbbcc33f56ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.173062 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.173098 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.173110 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j7h6\" (UniqueName: \"kubernetes.io/projected/fabdc806-55eb-4e8c-9105-dbbcc33f56ba-kube-api-access-7j7h6\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.194808 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" event={"ID":"9ec54373-337d-4d3f-a3ae-1b5be5f892b4","Type":"ContainerDied","Data":"079985dc57a5ac338c003cf1ec6d21f818e7ea54bad88983a7ff898123c6a98b"} Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.194870 4749 scope.go:117] "RemoveContainer" containerID="a9398e22e233fff5ed277aec93ec36e8fd610c432a1ce0b6b86871b0aca3f3d1" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.195159 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-mf4jk" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.197588 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerStarted","Data":"ceb2aa329c93c31ba7a2594a4472562031874d7fc4bed2092cdb6f04400c4b2e"} Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.200730 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f9fb94c-26de-427c-bd82-919875c80787","Type":"ContainerStarted","Data":"c798bf1065c76018a3ab8a7e30f6039e26c1cd6d6c92a380e42d6df37871902f"} Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.200770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f9fb94c-26de-427c-bd82-919875c80787","Type":"ContainerStarted","Data":"1941631a6f04627f51b6662326a350f0187f0bd9febbcac72de9d25cd1fe9f9e"} Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.200783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f9fb94c-26de-427c-bd82-919875c80787","Type":"ContainerStarted","Data":"9b2e9da187f69599af6498e7047cd6606cd084cc6c0587b3b8f6d50a6af41b90"} Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.202386 4749 generic.go:334] "Generic (PLEG): container finished" podID="fabdc806-55eb-4e8c-9105-dbbcc33f56ba" containerID="bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef" exitCode=0 Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.202436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fabdc806-55eb-4e8c-9105-dbbcc33f56ba","Type":"ContainerDied","Data":"bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef"} Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.202461 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fabdc806-55eb-4e8c-9105-dbbcc33f56ba","Type":"ContainerDied","Data":"f40f6c2f7d51587a845fc8cc29f7562d4934cccea4c0e88d056bcbd582ba3a8e"} Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.202465 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.226101 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-mf4jk"] Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.229758 4749 scope.go:117] "RemoveContainer" containerID="33f39a50bdc71171f16ed7433a471255c2d2ca50978e9d46933bc80fdc56239c" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.248507 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-mf4jk"] Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.253285 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.2532617569999998 podStartE2EDuration="2.253261757s" podCreationTimestamp="2026-01-28 19:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:02:07.234985242 +0000 UTC m=+1595.246512027" watchObservedRunningTime="2026-01-28 19:02:07.253261757 +0000 UTC m=+1595.264788532" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.273299 4749 scope.go:117] "RemoveContainer" containerID="bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.315814 4749 scope.go:117] "RemoveContainer" containerID="bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.315951 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:02:07 crc kubenswrapper[4749]: E0128 19:02:07.317868 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef\": container with ID starting with bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef not found: ID does not exist" containerID="bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.318071 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef"} err="failed to get container status \"bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef\": rpc error: code = NotFound desc = could not find container \"bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef\": container with ID starting with bb21fc20b2bc82f94bb218a5ce7a855ff93441900c9e78f944507312a79ec5ef not found: ID does not exist" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.345135 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.361012 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:02:07 crc kubenswrapper[4749]: E0128 19:02:07.361946 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" containerName="init" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.362052 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" containerName="init" Jan 28 19:02:07 crc kubenswrapper[4749]: E0128 19:02:07.362156 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabdc806-55eb-4e8c-9105-dbbcc33f56ba" 
containerName="nova-scheduler-scheduler" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.362209 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabdc806-55eb-4e8c-9105-dbbcc33f56ba" containerName="nova-scheduler-scheduler" Jan 28 19:02:07 crc kubenswrapper[4749]: E0128 19:02:07.362271 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" containerName="dnsmasq-dns" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.362318 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" containerName="dnsmasq-dns" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.362764 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabdc806-55eb-4e8c-9105-dbbcc33f56ba" containerName="nova-scheduler-scheduler" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.362859 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" containerName="dnsmasq-dns" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.363843 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.372424 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.379729 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.479995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.480075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-config-data\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.480199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg2r5\" (UniqueName: \"kubernetes.io/projected/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-kube-api-access-jg2r5\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.582490 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.582777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-config-data\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.582888 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jg2r5\" (UniqueName: \"kubernetes.io/projected/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-kube-api-access-jg2r5\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.586751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.591889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-config-data\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.603953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg2r5\" (UniqueName: \"kubernetes.io/projected/62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e-kube-api-access-jg2r5\") pod \"nova-scheduler-0\" (UID: \"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e\") " pod="openstack/nova-scheduler-0" Jan 28 19:02:07 crc kubenswrapper[4749]: I0128 19:02:07.692083 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 19:02:08 crc kubenswrapper[4749]: W0128 19:02:08.238725 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62d97098_ddeb_4da3_9c51_2b7eb9a5cd4e.slice/crio-3fcf3cc5add5111893c3d87e916fc84d5c5f820fff036b051a182b6d5993742e WatchSource:0}: Error finding container 3fcf3cc5add5111893c3d87e916fc84d5c5f820fff036b051a182b6d5993742e: Status 404 returned error can't find the container with id 3fcf3cc5add5111893c3d87e916fc84d5c5f820fff036b051a182b6d5993742e Jan 28 19:02:08 crc kubenswrapper[4749]: I0128 19:02:08.240929 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 19:02:08 crc kubenswrapper[4749]: I0128 19:02:08.884048 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec54373-337d-4d3f-a3ae-1b5be5f892b4" path="/var/lib/kubelet/pods/9ec54373-337d-4d3f-a3ae-1b5be5f892b4/volumes" Jan 28 19:02:08 crc kubenswrapper[4749]: I0128 19:02:08.885651 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fabdc806-55eb-4e8c-9105-dbbcc33f56ba" path="/var/lib/kubelet/pods/fabdc806-55eb-4e8c-9105-dbbcc33f56ba/volumes" Jan 28 19:02:09 crc kubenswrapper[4749]: I0128 19:02:09.243570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e","Type":"ContainerStarted","Data":"2b480ee76eba7cad902676a522942ce7ded08fab8f0c1f5218c82f1dda7c185a"} Jan 28 19:02:09 crc kubenswrapper[4749]: I0128 19:02:09.243713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e","Type":"ContainerStarted","Data":"3fcf3cc5add5111893c3d87e916fc84d5c5f820fff036b051a182b6d5993742e"} Jan 28 19:02:09 crc kubenswrapper[4749]: I0128 19:02:09.249481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerStarted","Data":"45b778eca206cdc6a4ccea7af927be16c86c3304f62ea93f5f34554a56209a7e"} Jan 28 19:02:09 crc kubenswrapper[4749]: I0128 19:02:09.277232 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.277214501 podStartE2EDuration="2.277214501s" podCreationTimestamp="2026-01-28 19:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:02:09.265000047 +0000 UTC m=+1597.276526842" watchObservedRunningTime="2026-01-28 19:02:09.277214501 +0000 UTC m=+1597.288741266" Jan 28 19:02:10 crc kubenswrapper[4749]: I0128 19:02:10.265153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerStarted","Data":"03f8a37116e1c63a52e8402032e2f6f68797609fa61df5867d201ffa146a1741"} Jan 28 19:02:10 crc kubenswrapper[4749]: I0128 19:02:10.288165 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.424005459 podStartE2EDuration="6.288141896s" podCreationTimestamp="2026-01-28 19:02:04 +0000 UTC" firstStartedPulling="2026-01-28 19:02:05.110621501 +0000 UTC m=+1593.122148276" lastFinishedPulling="2026-01-28 19:02:09.974757948 +0000 UTC m=+1597.986284713" observedRunningTime="2026-01-28 19:02:10.283679214 +0000 UTC m=+1598.295206019" watchObservedRunningTime="2026-01-28 19:02:10.288141896 +0000 UTC m=+1598.299668671" Jan 28 19:02:10 crc kubenswrapper[4749]: I0128 19:02:10.891917 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 19:02:10 crc kubenswrapper[4749]: I0128 19:02:10.892450 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 19:02:11 crc kubenswrapper[4749]: I0128 19:02:11.274217 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 19:02:12 crc kubenswrapper[4749]: I0128 19:02:12.492180 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 19:02:12 crc kubenswrapper[4749]: I0128 19:02:12.492453 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 19:02:12 crc kubenswrapper[4749]: I0128 19:02:12.692515 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 28 19:02:12 crc kubenswrapper[4749]: I0128 19:02:12.883002 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:02:12 crc kubenswrapper[4749]: E0128 19:02:12.883607 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:02:13 crc kubenswrapper[4749]: I0128 19:02:13.504565 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="31f23a26-7126-4b08-9616-49b0541bff3c" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.1.5:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:02:13 crc kubenswrapper[4749]: I0128 19:02:13.504685 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="31f23a26-7126-4b08-9616-49b0541bff3c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.5:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:02:15 crc kubenswrapper[4749]: I0128 19:02:15.891780 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 19:02:15 crc kubenswrapper[4749]: I0128 19:02:15.892293 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 19:02:16 crc kubenswrapper[4749]: I0128 19:02:16.910483 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8f9fb94c-26de-427c-bd82-919875c80787" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.7:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:02:16 crc kubenswrapper[4749]: I0128 19:02:16.910666 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8f9fb94c-26de-427c-bd82-919875c80787" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.7:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:02:17 crc kubenswrapper[4749]: I0128 19:02:17.693342 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 28 19:02:17 crc kubenswrapper[4749]: I0128 19:02:17.732208 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 28 19:02:18 crc kubenswrapper[4749]: I0128 19:02:18.413590 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 28 19:02:21 crc kubenswrapper[4749]: I0128 19:02:21.068643 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" podUID="c102c3d5-9654-48bf-be4f-6cb41b1f8d7a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:02:21 crc kubenswrapper[4749]: I0128 19:02:21.109583 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rbvsr" podUID="c102c3d5-9654-48bf-be4f-6cb41b1f8d7a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:02:21 crc kubenswrapper[4749]: I0128 19:02:21.636481 4749 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-c6vlq container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.9:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 19:02:21 crc kubenswrapper[4749]: I0128 19:02:21.636803 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-c6vlq" podUID="45c3f22c-a523-4e94-858c-97bdb2705b9e" containerName="perses-operator" probeResult="failure" output="Get 
\"http://10.217.0.9:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:02:22 crc kubenswrapper[4749]: I0128 19:02:22.501512 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 28 19:02:22 crc kubenswrapper[4749]: I0128 19:02:22.502593 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 28 19:02:22 crc kubenswrapper[4749]: I0128 19:02:22.510970 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 28 19:02:22 crc kubenswrapper[4749]: I0128 19:02:22.515311 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 28 19:02:23 crc kubenswrapper[4749]: I0128 19:02:23.416851 4749 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 19:02:23 crc kubenswrapper[4749]: I0128 19:02:23.417211 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="cb1f1961-f7f5-4c47-93f9-7f06ac02b45c" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:02:23 crc kubenswrapper[4749]: I0128 19:02:23.425667 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 28 19:02:23 crc kubenswrapper[4749]: I0128 19:02:23.431954 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 28 19:02:25 crc kubenswrapper[4749]: I0128 19:02:25.872215 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:02:25 crc kubenswrapper[4749]: E0128 19:02:25.879001 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:02:25 crc kubenswrapper[4749]: I0128 19:02:25.903431 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 28 19:02:25 crc kubenswrapper[4749]: I0128 19:02:25.905915 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 28 19:02:25 crc kubenswrapper[4749]: I0128 19:02:25.908081 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 28 19:02:26 crc kubenswrapper[4749]: I0128 19:02:26.462269 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 28 19:02:29 crc kubenswrapper[4749]: I0128 19:02:29.488937 4749 generic.go:334] "Generic (PLEG): container finished" podID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerID="84ba5cd8bdc4ee0ca66aa7f29dd240c9df5caa33635d0694f4b9cd7349bdb1f3" exitCode=137 Jan 28 19:02:29 crc kubenswrapper[4749]: I0128 19:02:29.489038 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerDied","Data":"84ba5cd8bdc4ee0ca66aa7f29dd240c9df5caa33635d0694f4b9cd7349bdb1f3"} Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.050159 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.247867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsbpq\" (UniqueName: \"kubernetes.io/projected/f786453c-9456-4fdf-99c2-cd432890a8ea-kube-api-access-xsbpq\") pod \"f786453c-9456-4fdf-99c2-cd432890a8ea\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.247991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-scripts\") pod \"f786453c-9456-4fdf-99c2-cd432890a8ea\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.248022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-config-data\") pod \"f786453c-9456-4fdf-99c2-cd432890a8ea\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.248266 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-combined-ca-bundle\") pod \"f786453c-9456-4fdf-99c2-cd432890a8ea\" (UID: \"f786453c-9456-4fdf-99c2-cd432890a8ea\") " Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.254766 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-scripts" (OuterVolumeSpecName: "scripts") pod "f786453c-9456-4fdf-99c2-cd432890a8ea" (UID: "f786453c-9456-4fdf-99c2-cd432890a8ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.270708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f786453c-9456-4fdf-99c2-cd432890a8ea-kube-api-access-xsbpq" (OuterVolumeSpecName: "kube-api-access-xsbpq") pod "f786453c-9456-4fdf-99c2-cd432890a8ea" (UID: "f786453c-9456-4fdf-99c2-cd432890a8ea"). InnerVolumeSpecName "kube-api-access-xsbpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.350859 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.350918 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsbpq\" (UniqueName: \"kubernetes.io/projected/f786453c-9456-4fdf-99c2-cd432890a8ea-kube-api-access-xsbpq\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.382096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-config-data" (OuterVolumeSpecName: "config-data") pod "f786453c-9456-4fdf-99c2-cd432890a8ea" (UID: "f786453c-9456-4fdf-99c2-cd432890a8ea"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.403511 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f786453c-9456-4fdf-99c2-cd432890a8ea" (UID: "f786453c-9456-4fdf-99c2-cd432890a8ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.453797 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.453839 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f786453c-9456-4fdf-99c2-cd432890a8ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.502102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f786453c-9456-4fdf-99c2-cd432890a8ea","Type":"ContainerDied","Data":"14f52031bdd931bf3218d9c24fa4dba1bd98443d224e0670437ee269cedc4088"} Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.502162 4749 scope.go:117] "RemoveContainer" containerID="84ba5cd8bdc4ee0ca66aa7f29dd240c9df5caa33635d0694f4b9cd7349bdb1f3" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.503315 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.538015 4749 scope.go:117] "RemoveContainer" containerID="b8476bb192297a5845725a9580be68f5c3d7fca1724c9a5daa24e6e43971e741" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.542828 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.559013 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.587865 4749 scope.go:117] "RemoveContainer" containerID="33b6726d0194209eb1aba47cb3dbfa3af65c51f020b2112de1cf35a5816ca589" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.591538 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 28 19:02:30 crc kubenswrapper[4749]: E0128 19:02:30.592310 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-notifier" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.592397 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-notifier" Jan 28 19:02:30 crc kubenswrapper[4749]: E0128 19:02:30.592429 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-evaluator" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.592436 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-evaluator" Jan 28 19:02:30 crc kubenswrapper[4749]: E0128 19:02:30.592468 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-api" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.592475 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-api" Jan 28 19:02:30 crc kubenswrapper[4749]: E0128 19:02:30.592491 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-listener" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.592497 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-listener" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.592748 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-notifier" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.592769 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-evaluator" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.592782 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-listener" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.592808 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" containerName="aodh-api" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.597960 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.602222 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.602309 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.602514 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-qbtm4" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.603144 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.606287 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.610776 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.623133 4749 scope.go:117] "RemoveContainer" containerID="eed0f7d56885bcbc54f3390b4e518fe9bc665edeabc64ef7ab7859dbfb4fdca8" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.660891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-config-data\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.660997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2nlx\" (UniqueName: \"kubernetes.io/projected/36a7915e-b865-4fbc-9e00-2a691100f162-kube-api-access-p2nlx\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.661030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-combined-ca-bundle\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.661085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-internal-tls-certs\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.661161 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-scripts\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.661245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-public-tls-certs\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.763605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2nlx\" (UniqueName: \"kubernetes.io/projected/36a7915e-b865-4fbc-9e00-2a691100f162-kube-api-access-p2nlx\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.763654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-combined-ca-bundle\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.763708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-internal-tls-certs\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.763782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-scripts\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.763878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-public-tls-certs\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.763941 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-config-data\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.767623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-scripts\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.768543 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-config-data\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.768946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-internal-tls-certs\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.769006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-combined-ca-bundle\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.770758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a7915e-b865-4fbc-9e00-2a691100f162-public-tls-certs\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.782135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2nlx\" (UniqueName: \"kubernetes.io/projected/36a7915e-b865-4fbc-9e00-2a691100f162-kube-api-access-p2nlx\") pod \"aodh-0\" (UID: \"36a7915e-b865-4fbc-9e00-2a691100f162\") " pod="openstack/aodh-0" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.885941 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f786453c-9456-4fdf-99c2-cd432890a8ea" path="/var/lib/kubelet/pods/f786453c-9456-4fdf-99c2-cd432890a8ea/volumes" Jan 28 19:02:30 crc kubenswrapper[4749]: I0128 19:02:30.925027 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 28 19:02:31 crc kubenswrapper[4749]: I0128 19:02:31.386401 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 28 19:02:31 crc kubenswrapper[4749]: I0128 19:02:31.513315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"36a7915e-b865-4fbc-9e00-2a691100f162","Type":"ContainerStarted","Data":"8c77583966731f78dfef88b001148f6fc7c5e8876f7f7116c0af1b578b6bbd4c"} Jan 28 19:02:32 crc kubenswrapper[4749]: I0128 19:02:32.527349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"36a7915e-b865-4fbc-9e00-2a691100f162","Type":"ContainerStarted","Data":"4cd06bafc7718e1908b4efa0cb21c29b37d1d1dd6b4d68935c7b43a112c4c651"} Jan 28 19:02:33 crc kubenswrapper[4749]: I0128 19:02:33.551992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"36a7915e-b865-4fbc-9e00-2a691100f162","Type":"ContainerStarted","Data":"a5dccb554ef1f88e511484778f6b9a6ac318d214a66c98958fb33fc0a3e78c36"} Jan 28 19:02:34 crc kubenswrapper[4749]: I0128 19:02:34.565001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"36a7915e-b865-4fbc-9e00-2a691100f162","Type":"ContainerStarted","Data":"a91171724bbe9bcb446faa189213ab54c6878bc1e4d9cffa47230c6bee48baca"} Jan 28 19:02:34 crc kubenswrapper[4749]: I0128 19:02:34.598962 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 28 19:02:35 crc kubenswrapper[4749]: I0128 19:02:35.578553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"36a7915e-b865-4fbc-9e00-2a691100f162","Type":"ContainerStarted","Data":"166caf7bbf7a729b60c0bfc5359fa84587eafc5e40aa2ac604937410e6459d98"} Jan 28 19:02:35 crc kubenswrapper[4749]: I0128 19:02:35.606580 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.8521181960000002 podStartE2EDuration="5.606560521s" podCreationTimestamp="2026-01-28 19:02:30 +0000 UTC" firstStartedPulling="2026-01-28 19:02:31.390314115 +0000 UTC m=+1619.401840880" lastFinishedPulling="2026-01-28 19:02:35.14475643 +0000 UTC m=+1623.156283205" observedRunningTime="2026-01-28 19:02:35.598929361 +0000 UTC m=+1623.610456146" watchObservedRunningTime="2026-01-28 19:02:35.606560521 +0000 UTC m=+1623.618087296" Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.093580 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.096050 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c2a507d2-d263-4695-96e3-4f7af4761450" containerName="kube-state-metrics" containerID="cri-o://51ae316fbc39ed75ce1554527e47df195568523c65a0f423144dd87f51d0c08f" gracePeriod=30 Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.264374 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.264648 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="63120686-40c3-4585-9e76-26484d36c17b" containerName="mysqld-exporter" containerID="cri-o://d639b8a3bfe72ee55142babc3e97dabc8b78b59375035dbdb08f0ddef4a8bec4" gracePeriod=30 Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.621465 4749 generic.go:334] "Generic (PLEG): 
container finished" podID="63120686-40c3-4585-9e76-26484d36c17b" containerID="d639b8a3bfe72ee55142babc3e97dabc8b78b59375035dbdb08f0ddef4a8bec4" exitCode=2 Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.621837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"63120686-40c3-4585-9e76-26484d36c17b","Type":"ContainerDied","Data":"d639b8a3bfe72ee55142babc3e97dabc8b78b59375035dbdb08f0ddef4a8bec4"} Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.624882 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2a507d2-d263-4695-96e3-4f7af4761450" containerID="51ae316fbc39ed75ce1554527e47df195568523c65a0f423144dd87f51d0c08f" exitCode=2 Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.624931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2a507d2-d263-4695-96e3-4f7af4761450","Type":"ContainerDied","Data":"51ae316fbc39ed75ce1554527e47df195568523c65a0f423144dd87f51d0c08f"} Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.817751 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.830402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj2bx\" (UniqueName: \"kubernetes.io/projected/c2a507d2-d263-4695-96e3-4f7af4761450-kube-api-access-jj2bx\") pod \"c2a507d2-d263-4695-96e3-4f7af4761450\" (UID: \"c2a507d2-d263-4695-96e3-4f7af4761450\") " Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.841714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a507d2-d263-4695-96e3-4f7af4761450-kube-api-access-jj2bx" (OuterVolumeSpecName: "kube-api-access-jj2bx") pod "c2a507d2-d263-4695-96e3-4f7af4761450" (UID: "c2a507d2-d263-4695-96e3-4f7af4761450"). InnerVolumeSpecName "kube-api-access-jj2bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.914865 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.934369 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvf56\" (UniqueName: \"kubernetes.io/projected/63120686-40c3-4585-9e76-26484d36c17b-kube-api-access-fvf56\") pod \"63120686-40c3-4585-9e76-26484d36c17b\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.934477 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-config-data\") pod \"63120686-40c3-4585-9e76-26484d36c17b\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.940176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63120686-40c3-4585-9e76-26484d36c17b-kube-api-access-fvf56" (OuterVolumeSpecName: "kube-api-access-fvf56") pod "63120686-40c3-4585-9e76-26484d36c17b" (UID: "63120686-40c3-4585-9e76-26484d36c17b"). InnerVolumeSpecName "kube-api-access-fvf56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.940409 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-combined-ca-bundle\") pod \"63120686-40c3-4585-9e76-26484d36c17b\" (UID: \"63120686-40c3-4585-9e76-26484d36c17b\") " Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.941775 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvf56\" (UniqueName: \"kubernetes.io/projected/63120686-40c3-4585-9e76-26484d36c17b-kube-api-access-fvf56\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:39 crc kubenswrapper[4749]: I0128 19:02:39.942006 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj2bx\" (UniqueName: \"kubernetes.io/projected/c2a507d2-d263-4695-96e3-4f7af4761450-kube-api-access-jj2bx\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.031997 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63120686-40c3-4585-9e76-26484d36c17b" (UID: "63120686-40c3-4585-9e76-26484d36c17b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.032685 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-config-data" (OuterVolumeSpecName: "config-data") pod "63120686-40c3-4585-9e76-26484d36c17b" (UID: "63120686-40c3-4585-9e76-26484d36c17b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.048248 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.048292 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63120686-40c3-4585-9e76-26484d36c17b-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.635948 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2a507d2-d263-4695-96e3-4f7af4761450","Type":"ContainerDied","Data":"e72752871dc7960ae74be230ed3f750b552f6d40a39e3e888dac6dbc6c346d22"} Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.635996 4749 scope.go:117] "RemoveContainer" containerID="51ae316fbc39ed75ce1554527e47df195568523c65a0f423144dd87f51d0c08f" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.635997 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.637668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"63120686-40c3-4585-9e76-26484d36c17b","Type":"ContainerDied","Data":"79f5f265d51673f545f42fe89f9bdd0e5e301dc2c9acfe0be3c9322e40c06904"} Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.637719 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.675682 4749 scope.go:117] "RemoveContainer" containerID="d639b8a3bfe72ee55142babc3e97dabc8b78b59375035dbdb08f0ddef4a8bec4" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.678880 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.696272 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.712485 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.725230 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.736444 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 19:02:40 crc kubenswrapper[4749]: E0128 19:02:40.736993 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a507d2-d263-4695-96e3-4f7af4761450" containerName="kube-state-metrics" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.737020 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a507d2-d263-4695-96e3-4f7af4761450" containerName="kube-state-metrics" Jan 28 19:02:40 crc kubenswrapper[4749]: E0128 19:02:40.737068 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63120686-40c3-4585-9e76-26484d36c17b" containerName="mysqld-exporter" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.737075 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="63120686-40c3-4585-9e76-26484d36c17b" containerName="mysqld-exporter" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.737284 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="63120686-40c3-4585-9e76-26484d36c17b" containerName="mysqld-exporter" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.737318 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a507d2-d263-4695-96e3-4f7af4761450" containerName="kube-state-metrics" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.738157 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.739795 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.740425 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.763515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.763623 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-config-data\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.763709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mljrz\" (UniqueName: \"kubernetes.io/projected/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-kube-api-access-mljrz\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.763802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.767307 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.769137 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.771150 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.772657 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.812747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.825715 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.865704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.865850 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.865895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wfw\" (UniqueName: \"kubernetes.io/projected/0229267a-385f-45ad-b285-2bb4be2c328d-kube-api-access-88wfw\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.866032 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mljrz\" (UniqueName: \"kubernetes.io/projected/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-kube-api-access-mljrz\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.866185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.866358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.866460 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 
19:02:40.866559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-config-data\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.872201 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:02:40 crc kubenswrapper[4749]: E0128 19:02:40.872912 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.877394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-config-data\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.877425 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.878317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.886800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mljrz\" (UniqueName: \"kubernetes.io/projected/5bd884b8-8914-47d0-b1f7-d85fc200ae9e-kube-api-access-mljrz\") pod \"mysqld-exporter-0\" (UID: \"5bd884b8-8914-47d0-b1f7-d85fc200ae9e\") " pod="openstack/mysqld-exporter-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.887860 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63120686-40c3-4585-9e76-26484d36c17b" path="/var/lib/kubelet/pods/63120686-40c3-4585-9e76-26484d36c17b/volumes" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.888575 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a507d2-d263-4695-96e3-4f7af4761450" path="/var/lib/kubelet/pods/c2a507d2-d263-4695-96e3-4f7af4761450/volumes" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.968963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.969178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.969278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.969359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88wfw\" (UniqueName: \"kubernetes.io/projected/0229267a-385f-45ad-b285-2bb4be2c328d-kube-api-access-88wfw\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.973508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.973849 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.974510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0229267a-385f-45ad-b285-2bb4be2c328d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:40 crc kubenswrapper[4749]: I0128 19:02:40.986574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wfw\" (UniqueName: \"kubernetes.io/projected/0229267a-385f-45ad-b285-2bb4be2c328d-kube-api-access-88wfw\") pod \"kube-state-metrics-0\" (UID: \"0229267a-385f-45ad-b285-2bb4be2c328d\") " pod="openstack/kube-state-metrics-0" Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.062001 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.091819 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.428682 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.429247 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="ceilometer-central-agent" containerID="cri-o://2a1cb1a67ce4979b089a5974362ea41e1f90517846d058aeaff95230cf2787e9" gracePeriod=30 Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.429949 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="proxy-httpd" containerID="cri-o://03f8a37116e1c63a52e8402032e2f6f68797609fa61df5867d201ffa146a1741" gracePeriod=30 Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.430140 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="ceilometer-notification-agent" containerID="cri-o://ceb2aa329c93c31ba7a2594a4472562031874d7fc4bed2092cdb6f04400c4b2e" gracePeriod=30 Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.430166 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="sg-core" containerID="cri-o://45b778eca206cdc6a4ccea7af927be16c86c3304f62ea93f5f34554a56209a7e" gracePeriod=30 Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.665208 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.679648 4749 generic.go:334] "Generic (PLEG): container finished" podID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerID="03f8a37116e1c63a52e8402032e2f6f68797609fa61df5867d201ffa146a1741" exitCode=0 Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.679697 4749 generic.go:334] "Generic (PLEG): container finished" podID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerID="45b778eca206cdc6a4ccea7af927be16c86c3304f62ea93f5f34554a56209a7e" exitCode=2 Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.679753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerDied","Data":"03f8a37116e1c63a52e8402032e2f6f68797609fa61df5867d201ffa146a1741"} Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.679791 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerDied","Data":"45b778eca206cdc6a4ccea7af927be16c86c3304f62ea93f5f34554a56209a7e"} Jan 28 19:02:41 crc kubenswrapper[4749]: I0128 19:02:41.688370 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 19:02:41 crc kubenswrapper[4749]: W0128 19:02:41.711987 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0229267a_385f_45ad_b285_2bb4be2c328d.slice/crio-0014e313ea8a25808787b9724e075b52efe12d495f1b44f311616c1bacb113f3 WatchSource:0}: Error finding container 0014e313ea8a25808787b9724e075b52efe12d495f1b44f311616c1bacb113f3: Status 404 returned error can't find the container with id 
0014e313ea8a25808787b9724e075b52efe12d495f1b44f311616c1bacb113f3 Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.702255 4749 generic.go:334] "Generic (PLEG): container finished" podID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerID="2a1cb1a67ce4979b089a5974362ea41e1f90517846d058aeaff95230cf2787e9" exitCode=0 Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.702347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerDied","Data":"2a1cb1a67ce4979b089a5974362ea41e1f90517846d058aeaff95230cf2787e9"} Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.706195 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0229267a-385f-45ad-b285-2bb4be2c328d","Type":"ContainerStarted","Data":"a2ba155f9d2d4a3ca40ae7c7ffc5cd3170ac41cf09c4fc340bd69769c89945a2"} Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.706235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0229267a-385f-45ad-b285-2bb4be2c328d","Type":"ContainerStarted","Data":"0014e313ea8a25808787b9724e075b52efe12d495f1b44f311616c1bacb113f3"} Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.706364 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.709163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"5bd884b8-8914-47d0-b1f7-d85fc200ae9e","Type":"ContainerStarted","Data":"77eb7fb6a6bf5b5057328dd807c9a3c37d831f9df875e06e9991bfa1ef2ff524"} Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.709206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"5bd884b8-8914-47d0-b1f7-d85fc200ae9e","Type":"ContainerStarted","Data":"d59b3fc137b3f94973595d9e42378ad94641f1717bbfc96dcece8d62f1b1fe90"} Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.741806 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.21411493 podStartE2EDuration="2.741788191s" podCreationTimestamp="2026-01-28 19:02:40 +0000 UTC" firstStartedPulling="2026-01-28 19:02:41.717431482 +0000 UTC m=+1629.728958257" lastFinishedPulling="2026-01-28 19:02:42.245104743 +0000 UTC m=+1630.256631518" observedRunningTime="2026-01-28 19:02:42.723864416 +0000 UTC m=+1630.735391201" watchObservedRunningTime="2026-01-28 19:02:42.741788191 +0000 UTC m=+1630.753314976" Jan 28 19:02:42 crc kubenswrapper[4749]: I0128 19:02:42.748408 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.170121067 podStartE2EDuration="2.748383576s" podCreationTimestamp="2026-01-28 19:02:40 +0000 UTC" firstStartedPulling="2026-01-28 19:02:41.690284496 +0000 UTC m=+1629.701811281" lastFinishedPulling="2026-01-28 19:02:42.268547015 +0000 UTC m=+1630.280073790" observedRunningTime="2026-01-28 19:02:42.741593927 +0000 UTC m=+1630.753120722" watchObservedRunningTime="2026-01-28 19:02:42.748383576 +0000 UTC m=+1630.759910371" Jan 28 19:02:43 crc kubenswrapper[4749]: I0128 19:02:43.730612 4749 generic.go:334] "Generic (PLEG): container finished" podID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerID="ceb2aa329c93c31ba7a2594a4472562031874d7fc4bed2092cdb6f04400c4b2e" exitCode=0 Jan 28 19:02:43 crc kubenswrapper[4749]: I0128 
19:02:43.730683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerDied","Data":"ceb2aa329c93c31ba7a2594a4472562031874d7fc4bed2092cdb6f04400c4b2e"} Jan 28 19:02:43 crc kubenswrapper[4749]: I0128 19:02:43.946018 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.057909 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-config-data\") pod \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.057990 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-run-httpd\") pod \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.058056 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-log-httpd\") pod \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.058086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-scripts\") pod \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.058119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-combined-ca-bundle\") pod \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.058217 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-sg-core-conf-yaml\") pod \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.058367 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjb28\" (UniqueName: \"kubernetes.io/projected/e4a6a301-409d-4642-a4b6-c15b2fc03c84-kube-api-access-fjb28\") pod \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\" (UID: \"e4a6a301-409d-4642-a4b6-c15b2fc03c84\") " Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.060186 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4a6a301-409d-4642-a4b6-c15b2fc03c84" (UID: "e4a6a301-409d-4642-a4b6-c15b2fc03c84"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.064675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-scripts" (OuterVolumeSpecName: "scripts") pod "e4a6a301-409d-4642-a4b6-c15b2fc03c84" (UID: "e4a6a301-409d-4642-a4b6-c15b2fc03c84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.065262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a6a301-409d-4642-a4b6-c15b2fc03c84-kube-api-access-fjb28" (OuterVolumeSpecName: "kube-api-access-fjb28") pod "e4a6a301-409d-4642-a4b6-c15b2fc03c84" (UID: "e4a6a301-409d-4642-a4b6-c15b2fc03c84"). InnerVolumeSpecName "kube-api-access-fjb28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.069852 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4a6a301-409d-4642-a4b6-c15b2fc03c84" (UID: "e4a6a301-409d-4642-a4b6-c15b2fc03c84"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.092784 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4a6a301-409d-4642-a4b6-c15b2fc03c84" (UID: "e4a6a301-409d-4642-a4b6-c15b2fc03c84"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.161021 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjb28\" (UniqueName: \"kubernetes.io/projected/e4a6a301-409d-4642-a4b6-c15b2fc03c84-kube-api-access-fjb28\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.161304 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.161406 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a301-409d-4642-a4b6-c15b2fc03c84-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.161512 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.161593 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.161621 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a6a301-409d-4642-a4b6-c15b2fc03c84" (UID: "e4a6a301-409d-4642-a4b6-c15b2fc03c84"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.213053 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-config-data" (OuterVolumeSpecName: "config-data") pod "e4a6a301-409d-4642-a4b6-c15b2fc03c84" (UID: "e4a6a301-409d-4642-a4b6-c15b2fc03c84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.265829 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.265874 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a301-409d-4642-a4b6-c15b2fc03c84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.743144 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a6a301-409d-4642-a4b6-c15b2fc03c84","Type":"ContainerDied","Data":"1844a219bf8f001899baaf4503ba8ecfd9128d2d7dad790331a4dc0dad7b0fbd"} Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.743406 4749 scope.go:117] "RemoveContainer" containerID="03f8a37116e1c63a52e8402032e2f6f68797609fa61df5867d201ffa146a1741" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.743190 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.779188 4749 scope.go:117] "RemoveContainer" containerID="45b778eca206cdc6a4ccea7af927be16c86c3304f62ea93f5f34554a56209a7e" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.781621 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.799207 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.802450 4749 scope.go:117] "RemoveContainer" containerID="ceb2aa329c93c31ba7a2594a4472562031874d7fc4bed2092cdb6f04400c4b2e" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.814588 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:44 crc kubenswrapper[4749]: E0128 19:02:44.815087 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="sg-core" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.815107 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="sg-core" Jan 28 19:02:44 crc kubenswrapper[4749]: E0128 19:02:44.815117 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="proxy-httpd" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.815123 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="proxy-httpd" Jan 28 19:02:44 crc kubenswrapper[4749]: E0128 19:02:44.815163 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="ceilometer-central-agent" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.815171 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="ceilometer-central-agent" Jan 28 19:02:44 crc kubenswrapper[4749]: E0128 19:02:44.815187 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="ceilometer-notification-agent" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.815193 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="ceilometer-notification-agent" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.815404 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="ceilometer-central-agent" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.815421 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="sg-core" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.815454 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="proxy-httpd" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.815469 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" containerName="ceilometer-notification-agent" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.817725 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.820102 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.820223 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.821079 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.831869 4749 scope.go:117] "RemoveContainer" containerID="2a1cb1a67ce4979b089a5974362ea41e1f90517846d058aeaff95230cf2787e9" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.842866 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.879959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.880045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3212c9f3-5620-46b0-bece-ec7ea4b9763a-run-httpd\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.880109 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-scripts\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.880260 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3212c9f3-5620-46b0-bece-ec7ea4b9763a-log-httpd\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.880376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.880442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffx8n\" (UniqueName: \"kubernetes.io/projected/3212c9f3-5620-46b0-bece-ec7ea4b9763a-kube-api-access-ffx8n\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.880471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.880550 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-config-data\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.888075 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a6a301-409d-4642-a4b6-c15b2fc03c84" path="/var/lib/kubelet/pods/e4a6a301-409d-4642-a4b6-c15b2fc03c84/volumes" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.982406 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffx8n\" (UniqueName: \"kubernetes.io/projected/3212c9f3-5620-46b0-bece-ec7ea4b9763a-kube-api-access-ffx8n\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.982480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.982578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-config-data\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.982634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.982686 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3212c9f3-5620-46b0-bece-ec7ea4b9763a-run-httpd\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.982705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-scripts\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.982804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3212c9f3-5620-46b0-bece-ec7ea4b9763a-log-httpd\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.982877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.983884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3212c9f3-5620-46b0-bece-ec7ea4b9763a-run-httpd\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.983916 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3212c9f3-5620-46b0-bece-ec7ea4b9763a-log-httpd\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.989002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-config-data\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.989654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.989688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-scripts\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:44 crc kubenswrapper[4749]: I0128 19:02:44.990074 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:45 crc kubenswrapper[4749]: I0128 19:02:45.000210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3212c9f3-5620-46b0-bece-ec7ea4b9763a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:45 crc kubenswrapper[4749]: I0128 19:02:45.002575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffx8n\" (UniqueName: \"kubernetes.io/projected/3212c9f3-5620-46b0-bece-ec7ea4b9763a-kube-api-access-ffx8n\") pod \"ceilometer-0\" (UID: \"3212c9f3-5620-46b0-bece-ec7ea4b9763a\") " pod="openstack/ceilometer-0" Jan 28 19:02:45 crc kubenswrapper[4749]: I0128 19:02:45.153744 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 19:02:45 crc kubenswrapper[4749]: I0128 19:02:45.680643 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 19:02:45 crc kubenswrapper[4749]: I0128 19:02:45.762892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3212c9f3-5620-46b0-bece-ec7ea4b9763a","Type":"ContainerStarted","Data":"912e68c6d6fc04224a009f9a9506bc70d8b59be8260849e45dffb42cbfd65cc5"} Jan 28 19:02:47 crc kubenswrapper[4749]: I0128 19:02:47.781804 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3212c9f3-5620-46b0-bece-ec7ea4b9763a","Type":"ContainerStarted","Data":"c04a2294ef3cbbf2d18dfff5d9ddb92464d9f2a0dcc70d61371b82ea2e035d53"} Jan 28 19:02:48 crc kubenswrapper[4749]: I0128 19:02:48.795122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3212c9f3-5620-46b0-bece-ec7ea4b9763a","Type":"ContainerStarted","Data":"7893726a3d442e20ff9ce4b36cd156bd357a8609038e6e8d2b52e6abef5b81bb"} Jan 28 19:02:49 crc kubenswrapper[4749]: I0128 19:02:49.807760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3212c9f3-5620-46b0-bece-ec7ea4b9763a","Type":"ContainerStarted","Data":"b0e9719740d0ec4d7b2212716cc6c50974847f9fabe80ca41170a5c049fe6c09"} Jan 28 19:02:51 crc kubenswrapper[4749]: I0128 19:02:51.104737 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 28 19:02:51 crc kubenswrapper[4749]: I0128 19:02:51.829881 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3212c9f3-5620-46b0-bece-ec7ea4b9763a","Type":"ContainerStarted","Data":"de5b66fc55929a9265682aed5ce6e1e268320d4464fea2d5cf56e7dc12798063"} Jan 28 19:02:51 crc kubenswrapper[4749]: I0128 19:02:51.831573 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 19:02:51 crc kubenswrapper[4749]: I0128 19:02:51.863006 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5329918 podStartE2EDuration="7.862985031s" podCreationTimestamp="2026-01-28 19:02:44 +0000 UTC" firstStartedPulling="2026-01-28 19:02:45.685922692 +0000 UTC m=+1633.697449467" lastFinishedPulling="2026-01-28 19:02:51.015915923 +0000 UTC m=+1639.027442698" observedRunningTime="2026-01-28 19:02:51.862431747 +0000 UTC m=+1639.873958542" watchObservedRunningTime="2026-01-28 19:02:51.862985031 +0000 UTC m=+1639.874511806" Jan 28 19:02:53 crc kubenswrapper[4749]: I0128 19:02:53.871880 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:02:53 crc kubenswrapper[4749]: E0128 
19:02:53.872599 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:03:07 crc kubenswrapper[4749]: I0128 19:03:07.873699 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:03:07 crc kubenswrapper[4749]: E0128 19:03:07.874858 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:03:15 crc kubenswrapper[4749]: I0128 19:03:15.173813 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 28 19:03:18 crc kubenswrapper[4749]: I0128 19:03:18.871392 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:03:18 crc kubenswrapper[4749]: E0128 19:03:18.872168 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:03:24 crc kubenswrapper[4749]: I0128 19:03:24.835681 4749 scope.go:117] "RemoveContainer" containerID="844b79c7c29a31700ab2caeb7e1f8d9265a8d8980a97a1502c8a99f6594c1965" Jan 28 19:03:24 crc kubenswrapper[4749]: I0128 19:03:24.864841 4749 scope.go:117] "RemoveContainer" containerID="a0f54d4dc507ebe673b6600c741bb104f6fa42c3ddaa2b238abe945fd456c0fb" Jan 28 19:03:32 crc kubenswrapper[4749]: I0128 19:03:32.883476 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:03:32 crc kubenswrapper[4749]: E0128 19:03:32.886676 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:03:44 crc kubenswrapper[4749]: I0128 19:03:44.872057 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:03:44 crc kubenswrapper[4749]: E0128 19:03:44.872908 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:03:58 crc kubenswrapper[4749]: I0128 19:03:58.871469 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:03:58 crc kubenswrapper[4749]: E0128 19:03:58.872311 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:04:09 crc kubenswrapper[4749]: I0128 19:04:09.871633 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:04:09 crc kubenswrapper[4749]: E0128 19:04:09.872453 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:04:22 crc kubenswrapper[4749]: I0128 19:04:22.879423 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:04:22 crc kubenswrapper[4749]: E0128 19:04:22.880161 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:04:25 crc kubenswrapper[4749]: I0128 19:04:25.087560 4749 scope.go:117] "RemoveContainer" containerID="83f489209ce842339d56ae7a33c76fa63346bd2ac10f3d69ea7678f351754154" Jan 28 19:04:34 crc kubenswrapper[4749]: I0128 19:04:34.871709 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:04:34 crc kubenswrapper[4749]: E0128 19:04:34.873998 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:04:46 crc kubenswrapper[4749]: I0128 19:04:46.871967 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:04:46 crc kubenswrapper[4749]: E0128 19:04:46.874194 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:05:01 crc kubenswrapper[4749]: I0128 19:05:01.871222 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:05:01 crc kubenswrapper[4749]: E0128 19:05:01.871987 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:05:16 crc kubenswrapper[4749]: I0128 19:05:16.872206 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:05:16 crc kubenswrapper[4749]: E0128 19:05:16.873418 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:05:29 crc kubenswrapper[4749]: I0128 19:05:29.871714 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:05:29 crc kubenswrapper[4749]: E0128 19:05:29.872579 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:05:40 crc kubenswrapper[4749]: I0128 19:05:40.871934 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:05:40 crc kubenswrapper[4749]: E0128 19:05:40.873007 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:05:55 crc kubenswrapper[4749]: I0128 19:05:55.872034 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:05:55 crc kubenswrapper[4749]: E0128 19:05:55.872921 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:06:08 crc kubenswrapper[4749]: I0128 19:06:08.872192 4749 
scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:06:08 crc kubenswrapper[4749]: E0128 19:06:08.872984 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:06:22 crc kubenswrapper[4749]: I0128 19:06:22.880216 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:06:22 crc kubenswrapper[4749]: E0128 19:06:22.880935 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:06:33 crc kubenswrapper[4749]: I0128 19:06:33.072988 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-b604-account-create-update-h4ghn"] Jan 28 19:06:33 crc kubenswrapper[4749]: I0128 19:06:33.088203 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-b604-account-create-update-h4ghn"] Jan 28 19:06:33 crc kubenswrapper[4749]: I0128 19:06:33.098632 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7pqm"] Jan 28 19:06:33 crc kubenswrapper[4749]: I0128 19:06:33.109533 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-g7pqm"] Jan 28 19:06:34 crc kubenswrapper[4749]: I0128 19:06:34.871781 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:06:34 crc kubenswrapper[4749]: I0128 19:06:34.932992 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fda79a3-23ee-45f8-a75b-824d790b8304" path="/var/lib/kubelet/pods/4fda79a3-23ee-45f8-a75b-824d790b8304/volumes" Jan 28 19:06:34 crc kubenswrapper[4749]: I0128 19:06:34.936238 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c146686-53c3-4a99-96fb-db254774ab8f" path="/var/lib/kubelet/pods/6c146686-53c3-4a99-96fb-db254774ab8f/volumes" Jan 28 19:06:35 crc kubenswrapper[4749]: I0128 19:06:35.278220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"964ceb58d124af3d9dc59099eabbe04d1465f5b7f6623d3462a8a468f8af56df"} Jan 28 19:06:37 crc kubenswrapper[4749]: I0128 19:06:37.044933 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-n2jzp"] Jan 28 19:06:37 crc kubenswrapper[4749]: I0128 19:06:37.061011 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-n2jzp"] Jan 28 19:06:37 crc kubenswrapper[4749]: I0128 19:06:37.073067 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e400-account-create-update-fc5xv"] Jan 28 19:06:37 crc kubenswrapper[4749]: 
I0128 19:06:37.084668 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e400-account-create-update-fc5xv"] Jan 28 19:06:37 crc kubenswrapper[4749]: I0128 19:06:37.097318 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-sczdj"] Jan 28 19:06:37 crc kubenswrapper[4749]: I0128 19:06:37.108885 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-sczdj"] Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.045579 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-80e6-account-create-update-zqbb9"] Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.058153 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0a33-account-create-update-wx8kt"] Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.068781 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jn56l"] Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.079631 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0a33-account-create-update-wx8kt"] Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.091086 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-80e6-account-create-update-zqbb9"] Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.103096 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jn56l"] Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.892115 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef" path="/var/lib/kubelet/pods/5966b5d5-e9e8-4c91-b8cf-8253a8dee0ef/volumes" Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.896830 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93549c28-1d00-40bc-bddc-3e0d93b913f2" path="/var/lib/kubelet/pods/93549c28-1d00-40bc-bddc-3e0d93b913f2/volumes" Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.898812 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f07580-cd44-43b4-a459-69d3984e1c09" path="/var/lib/kubelet/pods/a4f07580-cd44-43b4-a459-69d3984e1c09/volumes" Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.901795 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4fc78d3-bd48-48df-b646-a7d44ed0bd3b" path="/var/lib/kubelet/pods/a4fc78d3-bd48-48df-b646-a7d44ed0bd3b/volumes" Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.903314 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af817e85-9839-4aae-afdd-c764fac277a2" path="/var/lib/kubelet/pods/af817e85-9839-4aae-afdd-c764fac277a2/volumes" Jan 28 19:06:38 crc kubenswrapper[4749]: I0128 19:06:38.906927 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46c02d7-38cd-48a6-acd9-dc45c744ce86" path="/var/lib/kubelet/pods/f46c02d7-38cd-48a6-acd9-dc45c744ce86/volumes" Jan 28 19:06:43 crc kubenswrapper[4749]: I0128 19:06:43.034182 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-fc3a-account-create-update-jwst7"] Jan 28 19:06:43 crc kubenswrapper[4749]: I0128 19:06:43.046912 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-znzw7"] Jan 28 19:06:43 crc kubenswrapper[4749]: I0128 19:06:43.058533 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-fc3a-account-create-update-jwst7"] Jan 28 19:06:43 crc 
kubenswrapper[4749]: I0128 19:06:43.069766 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-znzw7"] Jan 28 19:06:44 crc kubenswrapper[4749]: I0128 19:06:44.913899 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="094de6d7-0ad5-4985-9894-b004751d377b" path="/var/lib/kubelet/pods/094de6d7-0ad5-4985-9894-b004751d377b/volumes" Jan 28 19:06:44 crc kubenswrapper[4749]: I0128 19:06:44.918386 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea422f17-7b59-48ad-8434-557e5c0a6096" path="/var/lib/kubelet/pods/ea422f17-7b59-48ad-8434-557e5c0a6096/volumes" Jan 28 19:06:57 crc kubenswrapper[4749]: I0128 19:06:57.029837 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xkbsr"] Jan 28 19:06:57 crc kubenswrapper[4749]: I0128 19:06:57.042089 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xkbsr"] Jan 28 19:06:58 crc kubenswrapper[4749]: I0128 19:06:58.886002 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d52d0a-0edd-4a1f-a19b-674518ed9e6e" path="/var/lib/kubelet/pods/41d52d0a-0edd-4a1f-a19b-674518ed9e6e/volumes" Jan 28 19:07:05 crc kubenswrapper[4749]: I0128 19:07:05.032361 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vjt8b"] Jan 28 19:07:05 crc kubenswrapper[4749]: I0128 19:07:05.045549 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vjt8b"] Jan 28 19:07:06 crc kubenswrapper[4749]: I0128 19:07:06.886461 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c395eb86-c6bf-4d4b-b0dd-b19cc34c004f" path="/var/lib/kubelet/pods/c395eb86-c6bf-4d4b-b0dd-b19cc34c004f/volumes" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.212518 4749 scope.go:117] "RemoveContainer" containerID="d1b221d137ac5f4864550ffa146b1280923a550d9ba346d8b4cdc6eb65461619" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.244150 4749 scope.go:117] "RemoveContainer" containerID="5016d10624d9a3bea4c1f4a467059ec96f1445afa212377a3287ee0343464475" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.305810 4749 scope.go:117] "RemoveContainer" containerID="cd9352357a3a075908a05182180270262d056cc6d3b4e4c60181b6a14c3e7336" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.363055 4749 scope.go:117] "RemoveContainer" containerID="b4df288d0fcf726ae5713c462e9a8a45c0baddbe762e49efc7dcc6880758d949" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.434856 4749 scope.go:117] "RemoveContainer" containerID="cc5b0e81a030d2a95426ef6701fbc376a7d32b2ed769dbd5cc184f263acf7984" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.482155 4749 scope.go:117] "RemoveContainer" containerID="b0a2e5d15d4e0d8f93a4d664c61b639efd045fdea7c8a5511fce3dee5896d9f4" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.538279 4749 scope.go:117] "RemoveContainer" containerID="a416d2ce979da07dfebefd888927a8f891995745288db76de51ae3127f21542e" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.562725 4749 scope.go:117] "RemoveContainer" containerID="fecc9b27f86173262c43337af2e073e6deccf7a3304227458e32e61c10813aff" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.585502 4749 scope.go:117] "RemoveContainer" containerID="455a9d44004cda7cea0bcfd4442740bfc7de6f54ab80713f3dc85ff239824e7b" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.608268 4749 scope.go:117] "RemoveContainer" 
containerID="e5331745763b3092bbc7057fbfedf8b2068079d2719742af5e58e3b422448336" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.630828 4749 scope.go:117] "RemoveContainer" containerID="35ff58276818174f3329f6adecd8eb4e223c2c973c27e67b6953911af8dc5661" Jan 28 19:07:25 crc kubenswrapper[4749]: I0128 19:07:25.654111 4749 scope.go:117] "RemoveContainer" containerID="8cba7203c94e757479a1372a56e863942d63e9ad211e57cf733c8c1103757374" Jan 28 19:07:26 crc kubenswrapper[4749]: I0128 19:07:26.043073 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-85s7b"] Jan 28 19:07:26 crc kubenswrapper[4749]: I0128 19:07:26.056192 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d98d-account-create-update-bwvmb"] Jan 28 19:07:26 crc kubenswrapper[4749]: I0128 19:07:26.068565 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d98d-account-create-update-bwvmb"] Jan 28 19:07:26 crc kubenswrapper[4749]: I0128 19:07:26.079036 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-85s7b"] Jan 28 19:07:26 crc kubenswrapper[4749]: I0128 19:07:26.886193 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93832f13-c8d6-4b51-a99a-0a4a3383d7cc" path="/var/lib/kubelet/pods/93832f13-c8d6-4b51-a99a-0a4a3383d7cc/volumes" Jan 28 19:07:26 crc kubenswrapper[4749]: I0128 19:07:26.888521 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b87adb-b9c0-464c-93e8-e7b8528b8b51" path="/var/lib/kubelet/pods/d4b87adb-b9c0-464c-93e8-e7b8528b8b51/volumes" Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.040913 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-668s7"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.055367 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2wdfv"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.070319 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-668s7"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.081720 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-9zdr2"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.092119 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-a2a6-account-create-update-c8ps8"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.101631 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2wdfv"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.110630 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-9zdr2"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.119078 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-a2a6-account-create-update-c8ps8"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.128114 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4db7-account-create-update-d7fz2"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.137496 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74ad-account-create-update-vc5lw"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.149521 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4db7-account-create-update-d7fz2"] Jan 28 19:07:29 crc kubenswrapper[4749]: I0128 19:07:29.159352 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-74ad-account-create-update-vc5lw"] Jan 28 19:07:30 crc kubenswrapper[4749]: I0128 19:07:30.895255 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0042f276-29d6-4d7c-938d-4ab73a8162a5" path="/var/lib/kubelet/pods/0042f276-29d6-4d7c-938d-4ab73a8162a5/volumes" Jan 28 19:07:30 crc kubenswrapper[4749]: I0128 19:07:30.898089 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a45c64-e893-493f-803e-46674d56ec70" path="/var/lib/kubelet/pods/13a45c64-e893-493f-803e-46674d56ec70/volumes" Jan 28 19:07:30 crc kubenswrapper[4749]: I0128 19:07:30.900299 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4465bf8f-73d3-4317-b90e-4f6eac8a59c4" path="/var/lib/kubelet/pods/4465bf8f-73d3-4317-b90e-4f6eac8a59c4/volumes" Jan 28 19:07:30 crc kubenswrapper[4749]: I0128 19:07:30.901487 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f15fcf-5410-4b63-a8c1-b3e7329763ec" path="/var/lib/kubelet/pods/55f15fcf-5410-4b63-a8c1-b3e7329763ec/volumes" Jan 28 19:07:30 crc kubenswrapper[4749]: I0128 19:07:30.903267 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40db02b-0440-4ca9-a0a7-86e18dede584" path="/var/lib/kubelet/pods/c40db02b-0440-4ca9-a0a7-86e18dede584/volumes" Jan 28 19:07:30 crc kubenswrapper[4749]: I0128 19:07:30.904358 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c80fe934-a893-4d6e-9ca1-6df8d12dda6a" path="/var/lib/kubelet/pods/c80fe934-a893-4d6e-9ca1-6df8d12dda6a/volumes" Jan 28 19:07:34 crc kubenswrapper[4749]: I0128 19:07:34.034727 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-l6q85"] Jan 28 19:07:34 crc kubenswrapper[4749]: I0128 19:07:34.049158 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-l6q85"] Jan 28 19:07:34 crc kubenswrapper[4749]: I0128 19:07:34.888658 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9438aa-0957-4b78-8f0f-a12fb94e86b7" path="/var/lib/kubelet/pods/8e9438aa-0957-4b78-8f0f-a12fb94e86b7/volumes" Jan 28 19:08:25 crc kubenswrapper[4749]: I0128 19:08:25.915317 4749 scope.go:117] "RemoveContainer" containerID="5aec11ea2f974d2e68f51a1e7f1188b0fb61ca82c753cb4e70cb721b56c2bda5" Jan 28 19:08:25 crc kubenswrapper[4749]: I0128 19:08:25.941654 4749 scope.go:117] "RemoveContainer" containerID="4e32af488b666924810d28e3e0c7e2000c1b37473ed8d47f32719a69367104f3" Jan 28 19:08:26 crc kubenswrapper[4749]: I0128 19:08:25.999050 4749 scope.go:117] "RemoveContainer" containerID="f4c2e2176cf18698a23209257d01e1598f62e5f1513b2b83cc3c6f63a25591d3" Jan 28 19:08:26 crc kubenswrapper[4749]: I0128 19:08:26.054997 4749 scope.go:117] "RemoveContainer" containerID="d849d1211d477ed41314253092f15dd21b0d6a39d765095000aedab6749f6032" Jan 28 19:08:26 crc kubenswrapper[4749]: I0128 19:08:26.118634 4749 scope.go:117] "RemoveContainer" containerID="9510a383eade54e93a1c4916907ba24ef9118dc4136096d9a2f92f767795d92b" Jan 28 19:08:26 crc kubenswrapper[4749]: I0128 19:08:26.183072 4749 scope.go:117] "RemoveContainer" containerID="57d613d27573952eb2b44fc6f30ea26c59186678a3d0d4ae0d3483d49c996e37" Jan 28 19:08:26 crc kubenswrapper[4749]: I0128 19:08:26.232955 4749 scope.go:117] "RemoveContainer" containerID="c3472472691f8d9c6997cfa00f24c9f20e5ae67194723532afa4dbdfde7f1d01" Jan 28 19:08:26 crc kubenswrapper[4749]: I0128 19:08:26.254812 4749 scope.go:117] "RemoveContainer" 
containerID="60a8059d263f063b4dd1e964e8fc92e7a692f539e31e1963000aeeb6c32ddf54" Jan 28 19:08:26 crc kubenswrapper[4749]: I0128 19:08:26.276748 4749 scope.go:117] "RemoveContainer" containerID="a765e42e1a21b9f9541f537d44b145537a7be0d3bc5a98e662dffc8f45af5dfd" Jan 28 19:08:49 crc kubenswrapper[4749]: I0128 19:08:49.054275 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bljrg"] Jan 28 19:08:49 crc kubenswrapper[4749]: I0128 19:08:49.079498 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bljrg"] Jan 28 19:08:50 crc kubenswrapper[4749]: I0128 19:08:50.885948 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a76dd69-b64f-47d2-bf48-38731cd1b2d7" path="/var/lib/kubelet/pods/2a76dd69-b64f-47d2-bf48-38731cd1b2d7/volumes" Jan 28 19:08:54 crc kubenswrapper[4749]: I0128 19:08:54.031676 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-z8gws"] Jan 28 19:08:54 crc kubenswrapper[4749]: I0128 19:08:54.041502 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-z8gws"] Jan 28 19:08:54 crc kubenswrapper[4749]: I0128 19:08:54.888799 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22b0cba-1eee-4244-abe1-20e7694a9813" path="/var/lib/kubelet/pods/e22b0cba-1eee-4244-abe1-20e7694a9813/volumes" Jan 28 19:08:57 crc kubenswrapper[4749]: I0128 19:08:57.467756 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:08:57 crc kubenswrapper[4749]: I0128 19:08:57.468097 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:08:59 crc kubenswrapper[4749]: I0128 19:08:59.032549 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-j6wkc"] Jan 28 19:08:59 crc kubenswrapper[4749]: I0128 19:08:59.060581 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-j6wkc"] Jan 28 19:09:00 crc kubenswrapper[4749]: I0128 19:09:00.884079 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9425c299-238f-4293-920f-6ae7ab0c6bb1" path="/var/lib/kubelet/pods/9425c299-238f-4293-920f-6ae7ab0c6bb1/volumes" Jan 28 19:09:03 crc kubenswrapper[4749]: I0128 19:09:03.067477 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-6mtp7"] Jan 28 19:09:03 crc kubenswrapper[4749]: I0128 19:09:03.088069 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-6mtp7"] Jan 28 19:09:04 crc kubenswrapper[4749]: I0128 19:09:04.884870 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb710c26-5706-4ed4-a08d-641315121c9e" path="/var/lib/kubelet/pods/fb710c26-5706-4ed4-a08d-641315121c9e/volumes" Jan 28 19:09:20 crc kubenswrapper[4749]: I0128 19:09:20.054810 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-69wsn"] Jan 28 19:09:20 crc kubenswrapper[4749]: I0128 19:09:20.080199 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-sync-69wsn"] Jan 28 19:09:20 crc kubenswrapper[4749]: I0128 19:09:20.883842 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58bcb1ff-0480-4503-ac70-ce54b6ab6a2d" path="/var/lib/kubelet/pods/58bcb1ff-0480-4503-ac70-ce54b6ab6a2d/volumes" Jan 28 19:09:26 crc kubenswrapper[4749]: I0128 19:09:26.038611 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dhqq7"] Jan 28 19:09:26 crc kubenswrapper[4749]: I0128 19:09:26.048883 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dhqq7"] Jan 28 19:09:26 crc kubenswrapper[4749]: I0128 19:09:26.457372 4749 scope.go:117] "RemoveContainer" containerID="33ecd0b56caf949988ede8b5f2774a38aaf9aba7ae08cc05cf7510833a868ba8" Jan 28 19:09:26 crc kubenswrapper[4749]: I0128 19:09:26.488846 4749 scope.go:117] "RemoveContainer" containerID="4744fe55f05cf7776fdc2efc89c074b120d9340619840321981040e341d474dc" Jan 28 19:09:26 crc kubenswrapper[4749]: I0128 19:09:26.556926 4749 scope.go:117] "RemoveContainer" containerID="5cd99b761b7c639919edb05e077be3b019720251d15ea49760004e79f83885e2" Jan 28 19:09:26 crc kubenswrapper[4749]: I0128 19:09:26.604001 4749 scope.go:117] "RemoveContainer" containerID="96ddcecc6d4371edc7a5fe4a62190b4db4ccfab13b78b9050b57062b5b668a46" Jan 28 19:09:26 crc kubenswrapper[4749]: I0128 19:09:26.676180 4749 scope.go:117] "RemoveContainer" containerID="ef710b593bbf63d8356c3aefdd28e05efcaed121a63ba37ed81d06bda125384f" Jan 28 19:09:26 crc kubenswrapper[4749]: I0128 19:09:26.887015 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b37dae-71dc-4af8-9229-4f7124bcbb16" path="/var/lib/kubelet/pods/e9b37dae-71dc-4af8-9229-4f7124bcbb16/volumes" Jan 28 19:09:27 crc kubenswrapper[4749]: I0128 19:09:27.467543 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:09:27 crc kubenswrapper[4749]: I0128 19:09:27.467603 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:09:57 crc kubenswrapper[4749]: I0128 19:09:57.468824 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:09:57 crc kubenswrapper[4749]: I0128 19:09:57.469354 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:09:57 crc kubenswrapper[4749]: I0128 19:09:57.469401 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:09:57 crc kubenswrapper[4749]: I0128 19:09:57.470246 4749 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"964ceb58d124af3d9dc59099eabbe04d1465f5b7f6623d3462a8a468f8af56df"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:09:57 crc kubenswrapper[4749]: I0128 19:09:57.470292 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://964ceb58d124af3d9dc59099eabbe04d1465f5b7f6623d3462a8a468f8af56df" gracePeriod=600 Jan 28 19:09:58 crc kubenswrapper[4749]: I0128 19:09:58.532748 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="964ceb58d124af3d9dc59099eabbe04d1465f5b7f6623d3462a8a468f8af56df" exitCode=0 Jan 28 19:09:58 crc kubenswrapper[4749]: I0128 19:09:58.532792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"964ceb58d124af3d9dc59099eabbe04d1465f5b7f6623d3462a8a468f8af56df"} Jan 28 19:09:58 crc kubenswrapper[4749]: I0128 19:09:58.533296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184"} Jan 28 19:09:58 crc kubenswrapper[4749]: I0128 19:09:58.533316 4749 scope.go:117] "RemoveContainer" containerID="2d63e2fb3c016f9df61e344325f86c25859b952bbaa78b8ea132d1569fc0a56e" Jan 28 19:10:00 crc kubenswrapper[4749]: I0128 19:10:00.047754 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-68d1-account-create-update-6vdlq"] Jan 28 19:10:00 crc kubenswrapper[4749]: I0128 19:10:00.062501 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-54df-account-create-update-mnv5k"] Jan 28 19:10:00 crc kubenswrapper[4749]: I0128 19:10:00.074285 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-54df-account-create-update-mnv5k"] Jan 28 19:10:00 crc kubenswrapper[4749]: I0128 19:10:00.086792 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-68d1-account-create-update-6vdlq"] Jan 28 19:10:00 crc kubenswrapper[4749]: I0128 19:10:00.884754 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb40a95-9140-4421-8f68-ecc6870d903a" path="/var/lib/kubelet/pods/dfb40a95-9140-4421-8f68-ecc6870d903a/volumes" Jan 28 19:10:00 crc kubenswrapper[4749]: I0128 19:10:00.885510 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa815c9-d525-45d3-a179-dce588ffd65d" path="/var/lib/kubelet/pods/ffa815c9-d525-45d3-a179-dce588ffd65d/volumes" Jan 28 19:10:03 crc kubenswrapper[4749]: I0128 19:10:03.036856 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4k5x8"] Jan 28 19:10:03 crc kubenswrapper[4749]: I0128 19:10:03.050941 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4k5x8"] Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.040915 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6czqc"] Jan 28 19:10:04 
crc kubenswrapper[4749]: I0128 19:10:04.051005 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wwgd9"] Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.060961 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-55c2-account-create-update-478qb"] Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.073573 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6czqc"] Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.081451 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wwgd9"] Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.091055 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-55c2-account-create-update-478qb"] Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.887106 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b5eda2-7879-473b-b45d-e5ef3d128fa4" path="/var/lib/kubelet/pods/54b5eda2-7879-473b-b45d-e5ef3d128fa4/volumes" Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.888383 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3182d1-16fe-490b-80d1-b0d3445cf1c8" path="/var/lib/kubelet/pods/5b3182d1-16fe-490b-80d1-b0d3445cf1c8/volumes" Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.889155 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f8e837-ecac-47c7-9a8e-204d9bbbc42b" path="/var/lib/kubelet/pods/e0f8e837-ecac-47c7-9a8e-204d9bbbc42b/volumes" Jan 28 19:10:04 crc kubenswrapper[4749]: I0128 19:10:04.889808 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53d974f-353d-49e8-9347-658cf61ed52b" path="/var/lib/kubelet/pods/f53d974f-353d-49e8-9347-658cf61ed52b/volumes" Jan 28 19:10:26 crc kubenswrapper[4749]: I0128 19:10:26.886552 4749 scope.go:117] "RemoveContainer" containerID="5228904e8bfc7e1c49573772addcd5454552ea07d2fbe373ba8c93d6be1c6dee" Jan 28 19:10:26 crc kubenswrapper[4749]: I0128 19:10:26.914852 4749 scope.go:117] "RemoveContainer" containerID="2923f29d0cd47940ac2825943fe1a3e3b9b5e3d05ed6e9bf51ed9dda4463b1e7" Jan 28 19:10:26 crc kubenswrapper[4749]: I0128 19:10:26.990869 4749 scope.go:117] "RemoveContainer" containerID="aaef5ee0a92c08baea278b3664f232d458db46fe8ef4ec77605633d19da1d631" Jan 28 19:10:27 crc kubenswrapper[4749]: I0128 19:10:27.035972 4749 scope.go:117] "RemoveContainer" containerID="d8413800bacc7001e486a90301643948645e44db627651c5e1d9f1588ec6c17c" Jan 28 19:10:27 crc kubenswrapper[4749]: I0128 19:10:27.106917 4749 scope.go:117] "RemoveContainer" containerID="e65783790d4933c836c67b35bd2edce8ffbf949b50c09754fc13674b22b8253e" Jan 28 19:10:27 crc kubenswrapper[4749]: I0128 19:10:27.177458 4749 scope.go:117] "RemoveContainer" containerID="d07a18bba8c99d44c5c8e3774f5f83558a74c9473d2a3c63dd4b63e688fcaba1" Jan 28 19:10:27 crc kubenswrapper[4749]: I0128 19:10:27.223316 4749 scope.go:117] "RemoveContainer" containerID="ff087e09a40e738faed05bb60d542d39ad374e0b9eb49c055e6252ae1ec69cfd" Jan 28 19:10:27 crc kubenswrapper[4749]: I0128 19:10:27.997198 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-prxfb"] Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.002209 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.011971 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prxfb"] Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.095123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-catalog-content\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.095213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-utilities\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.095249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkbvs\" (UniqueName: \"kubernetes.io/projected/41c9c2d9-743f-4b6c-98e6-2de980249549-kube-api-access-rkbvs\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.197400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-catalog-content\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.197488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-utilities\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.197516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkbvs\" (UniqueName: \"kubernetes.io/projected/41c9c2d9-743f-4b6c-98e6-2de980249549-kube-api-access-rkbvs\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.197936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-catalog-content\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.197936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-utilities\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.198721 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-wsnmq"] Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.201687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.223922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkbvs\" (UniqueName: \"kubernetes.io/projected/41c9c2d9-743f-4b6c-98e6-2de980249549-kube-api-access-rkbvs\") pod \"redhat-operators-prxfb\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.239162 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wsnmq"] Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.301315 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-catalog-content\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.301596 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-utilities\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.301691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55dtf\" (UniqueName: \"kubernetes.io/projected/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-kube-api-access-55dtf\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.346832 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.404172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-catalog-content\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.404241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-utilities\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.404370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55dtf\" (UniqueName: \"kubernetes.io/projected/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-kube-api-access-55dtf\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.404975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-catalog-content\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.405071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-utilities\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.421941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55dtf\" (UniqueName: \"kubernetes.io/projected/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-kube-api-access-55dtf\") pod \"community-operators-wsnmq\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:28 crc kubenswrapper[4749]: I0128 19:10:28.586676 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.089026 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prxfb"] Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.242755 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wsnmq"] Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.862288 4749 generic.go:334] "Generic (PLEG): container finished" podID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerID="0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b" exitCode=0 Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.862415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsnmq" event={"ID":"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c","Type":"ContainerDied","Data":"0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b"} Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.862929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsnmq" event={"ID":"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c","Type":"ContainerStarted","Data":"39c485ae4eefe7f60c33896e8fc5df1e295d0b0f2317365056ca286f8a40c339"} Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.864939 4749 generic.go:334] "Generic (PLEG): container finished" podID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerID="ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae" exitCode=0 Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.865019 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prxfb" event={"ID":"41c9c2d9-743f-4b6c-98e6-2de980249549","Type":"ContainerDied","Data":"ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae"} Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.865055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prxfb" event={"ID":"41c9c2d9-743f-4b6c-98e6-2de980249549","Type":"ContainerStarted","Data":"a8cf12351d1bf02e9512a68e661b27ab12907fbbe81ac5537b51165d0baa814f"} Jan 28 19:10:29 crc kubenswrapper[4749]: I0128 19:10:29.865100 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.392623 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz29"] Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.395169 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.403226 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz29"] Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.466429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-catalog-content\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.466601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjxtq\" (UniqueName: \"kubernetes.io/projected/90030828-f267-4c73-843b-b2cc161437ce-kube-api-access-jjxtq\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.466677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-utilities\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.572544 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-utilities\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.573040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-catalog-content\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.573091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-utilities\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.573244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjxtq\" (UniqueName: \"kubernetes.io/projected/90030828-f267-4c73-843b-b2cc161437ce-kube-api-access-jjxtq\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.573495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-catalog-content\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.602538 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-xdb5g"] Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.605386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjxtq\" (UniqueName: \"kubernetes.io/projected/90030828-f267-4c73-843b-b2cc161437ce-kube-api-access-jjxtq\") pod \"redhat-marketplace-ttz29\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.607306 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.616136 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdb5g"] Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.678014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-utilities\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.678297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbbtc\" (UniqueName: \"kubernetes.io/projected/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-kube-api-access-rbbtc\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.678394 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-catalog-content\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.715064 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.781513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-utilities\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.781632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbbtc\" (UniqueName: \"kubernetes.io/projected/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-kube-api-access-rbbtc\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.781678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-catalog-content\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.783943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-catalog-content\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.784010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-utilities\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.799963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbbtc\" (UniqueName: \"kubernetes.io/projected/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-kube-api-access-rbbtc\") pod \"certified-operators-xdb5g\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.910277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prxfb" event={"ID":"41c9c2d9-743f-4b6c-98e6-2de980249549","Type":"ContainerStarted","Data":"b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38"} Jan 28 19:10:30 crc kubenswrapper[4749]: I0128 19:10:30.911869 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.415716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz29"] Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.474861 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdb5g"] Jan 28 19:10:31 crc kubenswrapper[4749]: W0128 19:10:31.476869 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f0abbf8_877d_4e64_9261_2bb3b9c38be3.slice/crio-ecc67289cb03af253f205f585c902d67c39aa5548d71861065b2ac60105f30c9 WatchSource:0}: Error finding container ecc67289cb03af253f205f585c902d67c39aa5548d71861065b2ac60105f30c9: Status 404 returned error can't find the container with id ecc67289cb03af253f205f585c902d67c39aa5548d71861065b2ac60105f30c9 Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.910106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsnmq" event={"ID":"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c","Type":"ContainerStarted","Data":"2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a"} Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.912250 4749 generic.go:334] "Generic (PLEG): container finished" podID="90030828-f267-4c73-843b-b2cc161437ce" containerID="7a69fd8a9ba21f4203f9323bd979df0a95c25a619aaa2f45c96d2332bdef43ff" exitCode=0 Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.912319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz29" event={"ID":"90030828-f267-4c73-843b-b2cc161437ce","Type":"ContainerDied","Data":"7a69fd8a9ba21f4203f9323bd979df0a95c25a619aaa2f45c96d2332bdef43ff"} Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.912365 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz29" event={"ID":"90030828-f267-4c73-843b-b2cc161437ce","Type":"ContainerStarted","Data":"b3e8bd2f54bef0292bc1390381e15c7c42241ea1710aa140a21920f6c95fb96f"} Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.919678 4749 generic.go:334] "Generic (PLEG): container finished" podID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerID="65c397716cc19e7924dca39166886666b06870dd87b9d0fb329fa42d7a372057" exitCode=0 Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.920497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdb5g" event={"ID":"0f0abbf8-877d-4e64-9261-2bb3b9c38be3","Type":"ContainerDied","Data":"65c397716cc19e7924dca39166886666b06870dd87b9d0fb329fa42d7a372057"} Jan 28 19:10:31 crc kubenswrapper[4749]: I0128 19:10:31.920558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdb5g" event={"ID":"0f0abbf8-877d-4e64-9261-2bb3b9c38be3","Type":"ContainerStarted","Data":"ecc67289cb03af253f205f585c902d67c39aa5548d71861065b2ac60105f30c9"} Jan 28 19:10:33 crc kubenswrapper[4749]: I0128 19:10:33.940238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz29" event={"ID":"90030828-f267-4c73-843b-b2cc161437ce","Type":"ContainerStarted","Data":"44cc35ad966217ba1846b545827429425271f5581df3b36dcfa72f5567244f51"} Jan 28 19:10:33 crc kubenswrapper[4749]: I0128 19:10:33.943004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-xdb5g" event={"ID":"0f0abbf8-877d-4e64-9261-2bb3b9c38be3","Type":"ContainerStarted","Data":"e34b045802a33cd62cbf743d9d31ec476b79200281c3974a3d7e2f7b25e4f174"} Jan 28 19:10:34 crc kubenswrapper[4749]: I0128 19:10:34.953469 4749 generic.go:334] "Generic (PLEG): container finished" podID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerID="2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a" exitCode=0 Jan 28 19:10:34 crc kubenswrapper[4749]: I0128 19:10:34.953564 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsnmq" event={"ID":"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c","Type":"ContainerDied","Data":"2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a"} Jan 28 19:10:35 crc kubenswrapper[4749]: I0128 19:10:35.972550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsnmq" event={"ID":"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c","Type":"ContainerStarted","Data":"d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65"} Jan 28 19:10:35 crc kubenswrapper[4749]: I0128 19:10:35.998734 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wsnmq" podStartSLOduration=2.37354383 podStartE2EDuration="7.998714361s" podCreationTimestamp="2026-01-28 19:10:28 +0000 UTC" firstStartedPulling="2026-01-28 19:10:29.864718334 +0000 UTC m=+2097.876245109" lastFinishedPulling="2026-01-28 19:10:35.489888865 +0000 UTC m=+2103.501415640" observedRunningTime="2026-01-28 19:10:35.990599742 +0000 UTC m=+2104.002126527" watchObservedRunningTime="2026-01-28 19:10:35.998714361 +0000 UTC m=+2104.010241136" Jan 28 19:10:36 crc kubenswrapper[4749]: I0128 19:10:36.984858 4749 generic.go:334] "Generic (PLEG): container finished" podID="90030828-f267-4c73-843b-b2cc161437ce" containerID="44cc35ad966217ba1846b545827429425271f5581df3b36dcfa72f5567244f51" exitCode=0 Jan 28 19:10:36 crc kubenswrapper[4749]: I0128 19:10:36.984938 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz29" event={"ID":"90030828-f267-4c73-843b-b2cc161437ce","Type":"ContainerDied","Data":"44cc35ad966217ba1846b545827429425271f5581df3b36dcfa72f5567244f51"} Jan 28 19:10:37 crc kubenswrapper[4749]: I0128 19:10:37.996642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz29" event={"ID":"90030828-f267-4c73-843b-b2cc161437ce","Type":"ContainerStarted","Data":"22ba2cd46ae52a215807f18e8a93072a412a4cf8de274c5563e748e1ec3fe562"} Jan 28 19:10:38 crc kubenswrapper[4749]: I0128 19:10:38.587585 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:38 crc kubenswrapper[4749]: I0128 19:10:38.587642 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:10:39 crc kubenswrapper[4749]: I0128 19:10:39.009319 4749 generic.go:334] "Generic (PLEG): container finished" podID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerID="e34b045802a33cd62cbf743d9d31ec476b79200281c3974a3d7e2f7b25e4f174" exitCode=0 Jan 28 19:10:39 crc kubenswrapper[4749]: I0128 19:10:39.009429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdb5g" 
event={"ID":"0f0abbf8-877d-4e64-9261-2bb3b9c38be3","Type":"ContainerDied","Data":"e34b045802a33cd62cbf743d9d31ec476b79200281c3974a3d7e2f7b25e4f174"} Jan 28 19:10:39 crc kubenswrapper[4749]: I0128 19:10:39.064057 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ttz29" podStartSLOduration=3.372971498 podStartE2EDuration="9.064038668s" podCreationTimestamp="2026-01-28 19:10:30 +0000 UTC" firstStartedPulling="2026-01-28 19:10:31.915082626 +0000 UTC m=+2099.926609401" lastFinishedPulling="2026-01-28 19:10:37.606149796 +0000 UTC m=+2105.617676571" observedRunningTime="2026-01-28 19:10:39.055311174 +0000 UTC m=+2107.066837959" watchObservedRunningTime="2026-01-28 19:10:39.064038668 +0000 UTC m=+2107.075565443" Jan 28 19:10:39 crc kubenswrapper[4749]: I0128 19:10:39.640963 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wsnmq" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="registry-server" probeResult="failure" output=< Jan 28 19:10:39 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:10:39 crc kubenswrapper[4749]: > Jan 28 19:10:40 crc kubenswrapper[4749]: I0128 19:10:40.715556 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:40 crc kubenswrapper[4749]: I0128 19:10:40.717202 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:10:41 crc kubenswrapper[4749]: I0128 19:10:41.031011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdb5g" event={"ID":"0f0abbf8-877d-4e64-9261-2bb3b9c38be3","Type":"ContainerStarted","Data":"6c7cca4fca6a81ae8b821f62bce1821deb3589c7fc8a5b4b0642154f86d04a9f"} Jan 28 19:10:41 crc kubenswrapper[4749]: I0128 19:10:41.061992 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xdb5g" podStartSLOduration=3.465667455 podStartE2EDuration="11.061974743s" podCreationTimestamp="2026-01-28 19:10:30 +0000 UTC" firstStartedPulling="2026-01-28 19:10:31.921862202 +0000 UTC m=+2099.933388977" lastFinishedPulling="2026-01-28 19:10:39.51816949 +0000 UTC m=+2107.529696265" observedRunningTime="2026-01-28 19:10:41.056381406 +0000 UTC m=+2109.067908191" watchObservedRunningTime="2026-01-28 19:10:41.061974743 +0000 UTC m=+2109.073501508" Jan 28 19:10:41 crc kubenswrapper[4749]: I0128 19:10:41.807876 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ttz29" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="registry-server" probeResult="failure" output=< Jan 28 19:10:41 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:10:41 crc kubenswrapper[4749]: > Jan 28 19:10:42 crc kubenswrapper[4749]: I0128 19:10:42.044766 4749 generic.go:334] "Generic (PLEG): container finished" podID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerID="b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38" exitCode=0 Jan 28 19:10:42 crc kubenswrapper[4749]: I0128 19:10:42.044819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prxfb" event={"ID":"41c9c2d9-743f-4b6c-98e6-2de980249549","Type":"ContainerDied","Data":"b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38"} Jan 28 19:10:44 crc 
kubenswrapper[4749]: I0128 19:10:44.068437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prxfb" event={"ID":"41c9c2d9-743f-4b6c-98e6-2de980249549","Type":"ContainerStarted","Data":"1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6"} Jan 28 19:10:44 crc kubenswrapper[4749]: I0128 19:10:44.097150 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-prxfb" podStartSLOduration=3.606525641 podStartE2EDuration="17.09712724s" podCreationTimestamp="2026-01-28 19:10:27 +0000 UTC" firstStartedPulling="2026-01-28 19:10:29.866783974 +0000 UTC m=+2097.878310759" lastFinishedPulling="2026-01-28 19:10:43.357385583 +0000 UTC m=+2111.368912358" observedRunningTime="2026-01-28 19:10:44.087698308 +0000 UTC m=+2112.099225103" watchObservedRunningTime="2026-01-28 19:10:44.09712724 +0000 UTC m=+2112.108654015" Jan 28 19:10:48 crc kubenswrapper[4749]: I0128 19:10:48.053800 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pnwds"] Jan 28 19:10:48 crc kubenswrapper[4749]: I0128 19:10:48.066226 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pnwds"] Jan 28 19:10:48 crc kubenswrapper[4749]: I0128 19:10:48.347151 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:48 crc kubenswrapper[4749]: I0128 19:10:48.347205 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:10:48 crc kubenswrapper[4749]: I0128 19:10:48.885723 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d68f84a9-069c-4b25-a939-cd98ba9ab12b" path="/var/lib/kubelet/pods/d68f84a9-069c-4b25-a939-cd98ba9ab12b/volumes" Jan 28 19:10:49 crc kubenswrapper[4749]: I0128 19:10:49.403672 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prxfb" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="registry-server" probeResult="failure" output=< Jan 28 19:10:49 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:10:49 crc kubenswrapper[4749]: > Jan 28 19:10:49 crc kubenswrapper[4749]: I0128 19:10:49.649115 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wsnmq" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="registry-server" probeResult="failure" output=< Jan 28 19:10:49 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:10:49 crc kubenswrapper[4749]: > Jan 28 19:10:50 crc kubenswrapper[4749]: I0128 19:10:50.912828 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:50 crc kubenswrapper[4749]: I0128 19:10:50.912874 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:10:51 crc kubenswrapper[4749]: I0128 19:10:51.776467 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ttz29" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="registry-server" probeResult="failure" output=< Jan 28 19:10:51 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:10:51 crc kubenswrapper[4749]: > Jan 28 19:10:51 crc 
kubenswrapper[4749]: I0128 19:10:51.966949 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xdb5g" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="registry-server" probeResult="failure" output=< Jan 28 19:10:51 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:10:51 crc kubenswrapper[4749]: > Jan 28 19:10:59 crc kubenswrapper[4749]: I0128 19:10:59.394941 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prxfb" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="registry-server" probeResult="failure" output=< Jan 28 19:10:59 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:10:59 crc kubenswrapper[4749]: > Jan 28 19:10:59 crc kubenswrapper[4749]: I0128 19:10:59.637837 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wsnmq" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="registry-server" probeResult="failure" output=< Jan 28 19:10:59 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:10:59 crc kubenswrapper[4749]: > Jan 28 19:11:00 crc kubenswrapper[4749]: I0128 19:11:00.770115 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:11:00 crc kubenswrapper[4749]: I0128 19:11:00.828023 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:11:01 crc kubenswrapper[4749]: I0128 19:11:01.594834 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz29"] Jan 28 19:11:01 crc kubenswrapper[4749]: I0128 19:11:01.970896 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xdb5g" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="registry-server" probeResult="failure" output=< Jan 28 19:11:01 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:11:01 crc kubenswrapper[4749]: > Jan 28 19:11:02 crc kubenswrapper[4749]: I0128 19:11:02.281093 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ttz29" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="registry-server" containerID="cri-o://22ba2cd46ae52a215807f18e8a93072a412a4cf8de274c5563e748e1ec3fe562" gracePeriod=2 Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.294128 4749 generic.go:334] "Generic (PLEG): container finished" podID="90030828-f267-4c73-843b-b2cc161437ce" containerID="22ba2cd46ae52a215807f18e8a93072a412a4cf8de274c5563e748e1ec3fe562" exitCode=0 Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.294228 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz29" event={"ID":"90030828-f267-4c73-843b-b2cc161437ce","Type":"ContainerDied","Data":"22ba2cd46ae52a215807f18e8a93072a412a4cf8de274c5563e748e1ec3fe562"} Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.294786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz29" event={"ID":"90030828-f267-4c73-843b-b2cc161437ce","Type":"ContainerDied","Data":"b3e8bd2f54bef0292bc1390381e15c7c42241ea1710aa140a21920f6c95fb96f"} Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.294805 4749 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3e8bd2f54bef0292bc1390381e15c7c42241ea1710aa140a21920f6c95fb96f" Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.336285 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.401576 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-utilities\") pod \"90030828-f267-4c73-843b-b2cc161437ce\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.401619 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-catalog-content\") pod \"90030828-f267-4c73-843b-b2cc161437ce\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.401658 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjxtq\" (UniqueName: \"kubernetes.io/projected/90030828-f267-4c73-843b-b2cc161437ce-kube-api-access-jjxtq\") pod \"90030828-f267-4c73-843b-b2cc161437ce\" (UID: \"90030828-f267-4c73-843b-b2cc161437ce\") " Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.402829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-utilities" (OuterVolumeSpecName: "utilities") pod "90030828-f267-4c73-843b-b2cc161437ce" (UID: "90030828-f267-4c73-843b-b2cc161437ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.410923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90030828-f267-4c73-843b-b2cc161437ce-kube-api-access-jjxtq" (OuterVolumeSpecName: "kube-api-access-jjxtq") pod "90030828-f267-4c73-843b-b2cc161437ce" (UID: "90030828-f267-4c73-843b-b2cc161437ce"). InnerVolumeSpecName "kube-api-access-jjxtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.436137 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90030828-f267-4c73-843b-b2cc161437ce" (UID: "90030828-f267-4c73-843b-b2cc161437ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.504455 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.504683 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90030828-f267-4c73-843b-b2cc161437ce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:03 crc kubenswrapper[4749]: I0128 19:11:03.504696 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjxtq\" (UniqueName: \"kubernetes.io/projected/90030828-f267-4c73-843b-b2cc161437ce-kube-api-access-jjxtq\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:04 crc kubenswrapper[4749]: I0128 19:11:04.308466 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttz29" Jan 28 19:11:04 crc kubenswrapper[4749]: I0128 19:11:04.355370 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz29"] Jan 28 19:11:04 crc kubenswrapper[4749]: I0128 19:11:04.365535 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz29"] Jan 28 19:11:04 crc kubenswrapper[4749]: I0128 19:11:04.886709 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90030828-f267-4c73-843b-b2cc161437ce" path="/var/lib/kubelet/pods/90030828-f267-4c73-843b-b2cc161437ce/volumes" Jan 28 19:11:08 crc kubenswrapper[4749]: I0128 19:11:08.642622 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:11:08 crc kubenswrapper[4749]: I0128 19:11:08.699918 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:11:08 crc kubenswrapper[4749]: I0128 19:11:08.889575 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wsnmq"] Jan 28 19:11:09 crc kubenswrapper[4749]: I0128 19:11:09.394251 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prxfb" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="registry-server" probeResult="failure" output=< Jan 28 19:11:09 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:11:09 crc kubenswrapper[4749]: > Jan 28 19:11:10 crc kubenswrapper[4749]: I0128 19:11:10.379497 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wsnmq" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="registry-server" containerID="cri-o://d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65" gracePeriod=2 Jan 28 19:11:10 crc kubenswrapper[4749]: I0128 19:11:10.859378 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:11:10 crc kubenswrapper[4749]: I0128 19:11:10.984972 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:11:10 crc kubenswrapper[4749]: I0128 19:11:10.998364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-utilities\") pod \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " Jan 28 19:11:10 crc kubenswrapper[4749]: I0128 19:11:10.998517 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-catalog-content\") pod \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " Jan 28 19:11:10 crc kubenswrapper[4749]: I0128 19:11:10.998700 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55dtf\" (UniqueName: \"kubernetes.io/projected/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-kube-api-access-55dtf\") pod \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\" (UID: \"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c\") " Jan 28 19:11:10 crc kubenswrapper[4749]: I0128 19:11:10.999656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-utilities" (OuterVolumeSpecName: "utilities") pod "030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" (UID: "030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.012833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-kube-api-access-55dtf" (OuterVolumeSpecName: "kube-api-access-55dtf") pod "030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" (UID: "030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c"). InnerVolumeSpecName "kube-api-access-55dtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.041456 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.083441 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" (UID: "030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.101831 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55dtf\" (UniqueName: \"kubernetes.io/projected/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-kube-api-access-55dtf\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.101863 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.101872 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.391071 4749 generic.go:334] "Generic (PLEG): container finished" podID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerID="d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65" exitCode=0 Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.391214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wsnmq" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.391266 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsnmq" event={"ID":"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c","Type":"ContainerDied","Data":"d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65"} Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.391301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsnmq" event={"ID":"030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c","Type":"ContainerDied","Data":"39c485ae4eefe7f60c33896e8fc5df1e295d0b0f2317365056ca286f8a40c339"} Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.391343 4749 scope.go:117] "RemoveContainer" containerID="d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.412225 4749 scope.go:117] "RemoveContainer" containerID="2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.433778 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wsnmq"] Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.457082 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wsnmq"] Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.466733 4749 scope.go:117] "RemoveContainer" containerID="0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.515201 4749 scope.go:117] "RemoveContainer" containerID="d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65" Jan 28 19:11:11 crc kubenswrapper[4749]: E0128 19:11:11.515618 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65\": container with ID starting with d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65 not found: ID does not exist" containerID="d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.515650 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65"} err="failed to get container status \"d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65\": rpc error: code = NotFound desc = could not find container \"d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65\": container with ID starting with d9e447989a594ed08e3f8b315bfbf2f97b26783711d0c3a79b4ba838b9654d65 not found: ID does not exist" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.515677 4749 scope.go:117] "RemoveContainer" containerID="2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a" Jan 28 19:11:11 crc kubenswrapper[4749]: E0128 19:11:11.515959 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a\": container with ID starting with 2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a not found: ID does not exist" containerID="2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.515980 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a"} err="failed to get container status \"2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a\": rpc error: code = NotFound desc = could not find container \"2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a\": container with ID starting with 2855917d9b4a8c5ba6bab5a85082ef522473172060140f352d454e8f425bc81a not found: ID does not exist" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.515998 4749 scope.go:117] "RemoveContainer" containerID="0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b" Jan 28 19:11:11 crc kubenswrapper[4749]: E0128 19:11:11.516241 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b\": container with ID starting with 0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b not found: ID does not exist" containerID="0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b" Jan 28 19:11:11 crc kubenswrapper[4749]: I0128 19:11:11.516267 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b"} err="failed to get container status \"0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b\": rpc error: code = NotFound desc = could not find container \"0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b\": container with ID starting with 0b42e9b4e1fdb65b59aa26de50a808dc4389190b7eec041074623fa843f1997b not found: ID does not exist" Jan 28 19:11:12 crc kubenswrapper[4749]: I0128 19:11:12.885159 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" path="/var/lib/kubelet/pods/030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c/volumes" Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.090860 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdb5g"] Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.091105 4749 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/certified-operators-xdb5g" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="registry-server" containerID="cri-o://6c7cca4fca6a81ae8b821f62bce1821deb3589c7fc8a5b4b0642154f86d04a9f" gracePeriod=2 Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.417296 4749 generic.go:334] "Generic (PLEG): container finished" podID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerID="6c7cca4fca6a81ae8b821f62bce1821deb3589c7fc8a5b4b0642154f86d04a9f" exitCode=0 Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.417347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdb5g" event={"ID":"0f0abbf8-877d-4e64-9261-2bb3b9c38be3","Type":"ContainerDied","Data":"6c7cca4fca6a81ae8b821f62bce1821deb3589c7fc8a5b4b0642154f86d04a9f"} Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.618214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.657775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-utilities\") pod \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.657827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-catalog-content\") pod \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.657930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbbtc\" (UniqueName: \"kubernetes.io/projected/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-kube-api-access-rbbtc\") pod \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\" (UID: \"0f0abbf8-877d-4e64-9261-2bb3b9c38be3\") " Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.658680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-utilities" (OuterVolumeSpecName: "utilities") pod "0f0abbf8-877d-4e64-9261-2bb3b9c38be3" (UID: "0f0abbf8-877d-4e64-9261-2bb3b9c38be3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.663051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-kube-api-access-rbbtc" (OuterVolumeSpecName: "kube-api-access-rbbtc") pod "0f0abbf8-877d-4e64-9261-2bb3b9c38be3" (UID: "0f0abbf8-877d-4e64-9261-2bb3b9c38be3"). InnerVolumeSpecName "kube-api-access-rbbtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.712404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f0abbf8-877d-4e64-9261-2bb3b9c38be3" (UID: "0f0abbf8-877d-4e64-9261-2bb3b9c38be3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.761265 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbbtc\" (UniqueName: \"kubernetes.io/projected/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-kube-api-access-rbbtc\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.761306 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:13 crc kubenswrapper[4749]: I0128 19:11:13.761320 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0abbf8-877d-4e64-9261-2bb3b9c38be3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.043781 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dhv8b"] Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.055288 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-dhv8b"] Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.431055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdb5g" event={"ID":"0f0abbf8-877d-4e64-9261-2bb3b9c38be3","Type":"ContainerDied","Data":"ecc67289cb03af253f205f585c902d67c39aa5548d71861065b2ac60105f30c9"} Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.431120 4749 scope.go:117] "RemoveContainer" containerID="6c7cca4fca6a81ae8b821f62bce1821deb3589c7fc8a5b4b0642154f86d04a9f" Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.431137 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdb5g" Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.453468 4749 scope.go:117] "RemoveContainer" containerID="e34b045802a33cd62cbf743d9d31ec476b79200281c3974a3d7e2f7b25e4f174" Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.477233 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdb5g"] Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.487564 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xdb5g"] Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.498702 4749 scope.go:117] "RemoveContainer" containerID="65c397716cc19e7924dca39166886666b06870dd87b9d0fb329fa42d7a372057" Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.884922 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" path="/var/lib/kubelet/pods/0f0abbf8-877d-4e64-9261-2bb3b9c38be3/volumes" Jan 28 19:11:14 crc kubenswrapper[4749]: I0128 19:11:14.885933 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75bbcdb-3953-4dbd-b80f-e3c487dc61fe" path="/var/lib/kubelet/pods/e75bbcdb-3953-4dbd-b80f-e3c487dc61fe/volumes" Jan 28 19:11:17 crc kubenswrapper[4749]: I0128 19:11:17.033483 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwj5s"] Jan 28 19:11:17 crc kubenswrapper[4749]: I0128 19:11:17.046787 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwj5s"] Jan 28 19:11:18 crc kubenswrapper[4749]: I0128 19:11:18.030489 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-jsdx8"] Jan 28 19:11:18 crc kubenswrapper[4749]: I0128 19:11:18.042525 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-4445-account-create-update-k492h"] Jan 28 19:11:18 crc kubenswrapper[4749]: I0128 19:11:18.054244 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-4445-account-create-update-k492h"] Jan 28 19:11:18 crc kubenswrapper[4749]: I0128 19:11:18.067221 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-jsdx8"] Jan 28 19:11:18 crc kubenswrapper[4749]: I0128 19:11:18.884702 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302b3dea-b921-4b69-9ef4-753a3aef5a2a" path="/var/lib/kubelet/pods/302b3dea-b921-4b69-9ef4-753a3aef5a2a/volumes" Jan 28 19:11:18 crc kubenswrapper[4749]: I0128 19:11:18.885852 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea37f1a-a23e-41cb-ae60-689f5aea8631" path="/var/lib/kubelet/pods/7ea37f1a-a23e-41cb-ae60-689f5aea8631/volumes" Jan 28 19:11:18 crc kubenswrapper[4749]: I0128 19:11:18.886808 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acf9d598-9f13-42c7-9f82-9d6138a0d553" path="/var/lib/kubelet/pods/acf9d598-9f13-42c7-9f82-9d6138a0d553/volumes" Jan 28 19:11:19 crc kubenswrapper[4749]: I0128 19:11:19.394971 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prxfb" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="registry-server" probeResult="failure" output=< Jan 28 19:11:19 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:11:19 crc kubenswrapper[4749]: > Jan 28 19:11:27 crc kubenswrapper[4749]: I0128 19:11:27.368949 4749 scope.go:117] "RemoveContainer" 
containerID="b5cfbe0c5a127402e7ad2df9ec5c73613547fe684456155a29b15843ca5143eb" Jan 28 19:11:27 crc kubenswrapper[4749]: I0128 19:11:27.408293 4749 scope.go:117] "RemoveContainer" containerID="3bf32d6779d7074fd91e25898146450163b2578508ed8dd1a47a55bc584d0a51" Jan 28 19:11:27 crc kubenswrapper[4749]: I0128 19:11:27.474466 4749 scope.go:117] "RemoveContainer" containerID="e6287d955a75e17627fdac6dc5d624355ff29fbb5447022303ff0285d921fb70" Jan 28 19:11:27 crc kubenswrapper[4749]: I0128 19:11:27.533691 4749 scope.go:117] "RemoveContainer" containerID="f62c52e024b4d5f4042bbd1eb0be714728b16b3ffcd41161786dd875129eecb9" Jan 28 19:11:27 crc kubenswrapper[4749]: I0128 19:11:27.595508 4749 scope.go:117] "RemoveContainer" containerID="352f9833b2a26acbbdd6d3e25705e4404bdb55f7587ec441f1b36f07b86495fc" Jan 28 19:11:28 crc kubenswrapper[4749]: I0128 19:11:28.397696 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:11:28 crc kubenswrapper[4749]: I0128 19:11:28.455602 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:11:29 crc kubenswrapper[4749]: I0128 19:11:29.223263 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prxfb"] Jan 28 19:11:29 crc kubenswrapper[4749]: I0128 19:11:29.586447 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-prxfb" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="registry-server" containerID="cri-o://1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6" gracePeriod=2 Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.198596 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.288063 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-catalog-content\") pod \"41c9c2d9-743f-4b6c-98e6-2de980249549\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.288199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkbvs\" (UniqueName: \"kubernetes.io/projected/41c9c2d9-743f-4b6c-98e6-2de980249549-kube-api-access-rkbvs\") pod \"41c9c2d9-743f-4b6c-98e6-2de980249549\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.288225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-utilities\") pod \"41c9c2d9-743f-4b6c-98e6-2de980249549\" (UID: \"41c9c2d9-743f-4b6c-98e6-2de980249549\") " Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.289087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-utilities" (OuterVolumeSpecName: "utilities") pod "41c9c2d9-743f-4b6c-98e6-2de980249549" (UID: "41c9c2d9-743f-4b6c-98e6-2de980249549"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.294222 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c9c2d9-743f-4b6c-98e6-2de980249549-kube-api-access-rkbvs" (OuterVolumeSpecName: "kube-api-access-rkbvs") pod "41c9c2d9-743f-4b6c-98e6-2de980249549" (UID: "41c9c2d9-743f-4b6c-98e6-2de980249549"). InnerVolumeSpecName "kube-api-access-rkbvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.391473 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkbvs\" (UniqueName: \"kubernetes.io/projected/41c9c2d9-743f-4b6c-98e6-2de980249549-kube-api-access-rkbvs\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.391508 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.393807 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41c9c2d9-743f-4b6c-98e6-2de980249549" (UID: "41c9c2d9-743f-4b6c-98e6-2de980249549"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.493941 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c9c2d9-743f-4b6c-98e6-2de980249549-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.603183 4749 generic.go:334] "Generic (PLEG): container finished" podID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerID="1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6" exitCode=0 Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.603234 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prxfb" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.603238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prxfb" event={"ID":"41c9c2d9-743f-4b6c-98e6-2de980249549","Type":"ContainerDied","Data":"1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6"} Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.603401 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prxfb" event={"ID":"41c9c2d9-743f-4b6c-98e6-2de980249549","Type":"ContainerDied","Data":"a8cf12351d1bf02e9512a68e661b27ab12907fbbe81ac5537b51165d0baa814f"} Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.603468 4749 scope.go:117] "RemoveContainer" containerID="1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.641801 4749 scope.go:117] "RemoveContainer" containerID="b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.643163 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prxfb"] Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.653888 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-prxfb"] Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.674877 4749 scope.go:117] "RemoveContainer" containerID="ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.719797 4749 scope.go:117] "RemoveContainer" containerID="1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6" Jan 28 19:11:30 crc kubenswrapper[4749]: E0128 19:11:30.720400 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6\": container with ID starting with 1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6 not found: ID does not exist" containerID="1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.720460 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6"} err="failed to get container status \"1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6\": rpc error: code = NotFound desc = could not find container \"1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6\": container with ID starting with 1e69bd437e8fdc456eaec8aebc1d7a8d5ae4c7be26a88b070d08182494ef22c6 not found: ID does not exist" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.720495 4749 scope.go:117] "RemoveContainer" containerID="b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38" Jan 28 19:11:30 crc kubenswrapper[4749]: E0128 19:11:30.720996 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38\": container with ID starting with b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38 not found: ID does not exist" containerID="b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.721035 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38"} err="failed to get container status \"b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38\": rpc error: code = NotFound desc = could not find container \"b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38\": container with ID starting with b592752ed4a83b78a773b1dd82f33b46e82916d31dd48f32fc35e7c466500f38 not found: ID does not exist" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.721060 4749 scope.go:117] "RemoveContainer" containerID="ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae" Jan 28 19:11:30 crc kubenswrapper[4749]: E0128 19:11:30.721314 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae\": container with ID starting with ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae not found: ID does not exist" containerID="ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.721356 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae"} err="failed to get container status \"ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae\": rpc error: code = NotFound desc = could not find container \"ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae\": container with ID starting with ed5c62a951b63a3737cd8261b1503fc946ca54b05a30d8e63a04d6292a5ff1ae not found: ID does not exist" Jan 28 19:11:30 crc kubenswrapper[4749]: I0128 19:11:30.886465 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" path="/var/lib/kubelet/pods/41c9c2d9-743f-4b6c-98e6-2de980249549/volumes" Jan 28 19:11:42 crc kubenswrapper[4749]: I0128 19:11:42.042792 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-jvjll"] Jan 28 19:11:42 crc kubenswrapper[4749]: I0128 19:11:42.057989 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-jvjll"] Jan 28 19:11:42 crc kubenswrapper[4749]: I0128 19:11:42.889994 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="450c0e30-35b1-49d3-b11d-996b73fb11e8" path="/var/lib/kubelet/pods/450c0e30-35b1-49d3-b11d-996b73fb11e8/volumes" Jan 28 19:11:57 crc kubenswrapper[4749]: I0128 19:11:57.467417 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:11:57 crc kubenswrapper[4749]: I0128 19:11:57.468506 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:12:01 crc kubenswrapper[4749]: I0128 19:12:01.037533 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-27pt7"] Jan 28 19:12:01 crc kubenswrapper[4749]: I0128 19:12:01.051028 4749 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-27pt7"] Jan 28 19:12:02 crc kubenswrapper[4749]: I0128 19:12:02.894209 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4890e1-bf10-412d-b8b1-00aac05c094c" path="/var/lib/kubelet/pods/3f4890e1-bf10-412d-b8b1-00aac05c094c/volumes" Jan 28 19:12:27 crc kubenswrapper[4749]: I0128 19:12:27.467169 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:12:27 crc kubenswrapper[4749]: I0128 19:12:27.467738 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:12:27 crc kubenswrapper[4749]: I0128 19:12:27.804383 4749 scope.go:117] "RemoveContainer" containerID="532837a3a9279aba0aaf8f5c1ca2cfeb44671c0c41344be9f94f326ca727713b" Jan 28 19:12:27 crc kubenswrapper[4749]: I0128 19:12:27.853959 4749 scope.go:117] "RemoveContainer" containerID="f0e2bb50653cae00337342616ac4b2e50a7633314f11d21f518e3bdd282e5e23" Jan 28 19:12:33 crc kubenswrapper[4749]: I0128 19:12:33.044891 4749 patch_prober.go:28] interesting pod/logging-loki-gateway-5b6db5567f-7qbgw container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 19:12:33 crc kubenswrapper[4749]: I0128 19:12:33.045549 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5b6db5567f-7qbgw" podUID="ca507b0e-375b-47b0-bd5e-c77f2bc7d521" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:12:36 crc kubenswrapper[4749]: I0128 19:12:36.093422 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="0229267a-385f-45ad-b285-2bb4be2c328d" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.11:8081/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:12:36 crc kubenswrapper[4749]: I0128 19:12:36.741560 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-mhc52" podUID="9cdb06ed-9c63-4f38-9276-42339904fdd0" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 19:12:57 crc kubenswrapper[4749]: I0128 19:12:57.467586 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:12:57 crc kubenswrapper[4749]: I0128 19:12:57.468045 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" 
podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:12:57 crc kubenswrapper[4749]: I0128 19:12:57.468101 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:12:57 crc kubenswrapper[4749]: I0128 19:12:57.469034 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:12:57 crc kubenswrapper[4749]: I0128 19:12:57.469119 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" gracePeriod=600 Jan 28 19:12:57 crc kubenswrapper[4749]: E0128 19:12:57.593046 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:12:58 crc kubenswrapper[4749]: I0128 19:12:58.536101 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" exitCode=0 Jan 28 19:12:58 crc kubenswrapper[4749]: I0128 19:12:58.536191 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184"} Jan 28 19:12:58 crc kubenswrapper[4749]: I0128 19:12:58.536455 4749 scope.go:117] "RemoveContainer" containerID="964ceb58d124af3d9dc59099eabbe04d1465f5b7f6623d3462a8a468f8af56df" Jan 28 19:12:58 crc kubenswrapper[4749]: I0128 19:12:58.537209 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:12:58 crc kubenswrapper[4749]: E0128 19:12:58.537528 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:13:08 crc kubenswrapper[4749]: I0128 19:13:08.871767 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:13:08 crc kubenswrapper[4749]: E0128 19:13:08.872598 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:13:22 crc kubenswrapper[4749]: I0128 19:13:22.880240 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:13:22 crc kubenswrapper[4749]: E0128 19:13:22.881300 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:13:34 crc kubenswrapper[4749]: I0128 19:13:34.871669 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:13:34 crc kubenswrapper[4749]: E0128 19:13:34.872591 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:13:45 crc kubenswrapper[4749]: I0128 19:13:45.871840 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:13:45 crc kubenswrapper[4749]: E0128 19:13:45.873075 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:13:57 crc kubenswrapper[4749]: I0128 19:13:57.871438 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:13:57 crc kubenswrapper[4749]: E0128 19:13:57.872186 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:14:12 crc kubenswrapper[4749]: I0128 19:14:12.881401 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:14:12 crc kubenswrapper[4749]: E0128 19:14:12.883643 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:14:26 crc kubenswrapper[4749]: I0128 19:14:26.872357 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:14:26 crc kubenswrapper[4749]: E0128 19:14:26.873222 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:14:40 crc kubenswrapper[4749]: I0128 19:14:40.872075 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:14:40 crc kubenswrapper[4749]: E0128 19:14:40.872812 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:14:54 crc kubenswrapper[4749]: I0128 19:14:54.871840 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:14:54 crc kubenswrapper[4749]: E0128 19:14:54.872658 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.152884 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs"] Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.153968 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="extract-content" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.153984 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="extract-content" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.153998 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154004 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154013 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154020 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: 
E0128 19:15:00.154037 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="extract-utilities" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154044 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="extract-utilities" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154051 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="extract-content" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154057 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="extract-content" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154116 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="extract-utilities" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154124 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="extract-utilities" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154139 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154145 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154156 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="extract-content" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154161 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="extract-content" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154203 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="extract-content" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154210 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="extract-content" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154297 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="extract-utilities" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154308 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="extract-utilities" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154369 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154378 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: E0128 19:15:00.154395 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="extract-utilities" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154404 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="extract-utilities" Jan 28 
19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154772 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c9c2d9-743f-4b6c-98e6-2de980249549" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154797 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="90030828-f267-4c73-843b-b2cc161437ce" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154808 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="030b7bf4-1be7-4a1b-b5a5-56f3cc02f99c" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.154852 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0abbf8-877d-4e64-9261-2bb3b9c38be3" containerName="registry-server" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.155989 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.158252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.158853 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.163920 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs"] Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.226593 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t758h\" (UniqueName: \"kubernetes.io/projected/cd0f6520-a360-40d0-a6aa-cd2775f09d94-kube-api-access-t758h\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.227082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd0f6520-a360-40d0-a6aa-cd2775f09d94-secret-volume\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.227417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd0f6520-a360-40d0-a6aa-cd2775f09d94-config-volume\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.330001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd0f6520-a360-40d0-a6aa-cd2775f09d94-secret-volume\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.330451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/cd0f6520-a360-40d0-a6aa-cd2775f09d94-config-volume\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.330513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t758h\" (UniqueName: \"kubernetes.io/projected/cd0f6520-a360-40d0-a6aa-cd2775f09d94-kube-api-access-t758h\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.331370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd0f6520-a360-40d0-a6aa-cd2775f09d94-config-volume\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.341198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd0f6520-a360-40d0-a6aa-cd2775f09d94-secret-volume\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.350243 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t758h\" (UniqueName: \"kubernetes.io/projected/cd0f6520-a360-40d0-a6aa-cd2775f09d94-kube-api-access-t758h\") pod \"collect-profiles-29493795-4zcfs\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:00 crc kubenswrapper[4749]: I0128 19:15:00.487437 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:01 crc kubenswrapper[4749]: I0128 19:15:01.052592 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs"] Jan 28 19:15:01 crc kubenswrapper[4749]: I0128 19:15:01.771805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" event={"ID":"cd0f6520-a360-40d0-a6aa-cd2775f09d94","Type":"ContainerStarted","Data":"a5cdfa449272e7041b744cab312a5929dd822a46450235edb04fa6426768b3ab"} Jan 28 19:15:01 crc kubenswrapper[4749]: I0128 19:15:01.772138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" event={"ID":"cd0f6520-a360-40d0-a6aa-cd2775f09d94","Type":"ContainerStarted","Data":"8266053d3f13217bf543226a8e37bb4b52d02f7a85caac19b7cfbf97b7041c88"} Jan 28 19:15:01 crc kubenswrapper[4749]: I0128 19:15:01.786699 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" podStartSLOduration=1.786680871 podStartE2EDuration="1.786680871s" podCreationTimestamp="2026-01-28 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 19:15:01.784183139 +0000 UTC m=+2369.795709924" watchObservedRunningTime="2026-01-28 19:15:01.786680871 +0000 UTC m=+2369.798207646" Jan 28 19:15:02 crc kubenswrapper[4749]: I0128 19:15:02.787605 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd0f6520-a360-40d0-a6aa-cd2775f09d94" containerID="a5cdfa449272e7041b744cab312a5929dd822a46450235edb04fa6426768b3ab" exitCode=0 Jan 28 19:15:02 crc kubenswrapper[4749]: I0128 19:15:02.787963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" event={"ID":"cd0f6520-a360-40d0-a6aa-cd2775f09d94","Type":"ContainerDied","Data":"a5cdfa449272e7041b744cab312a5929dd822a46450235edb04fa6426768b3ab"} Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.230642 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.340845 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t758h\" (UniqueName: \"kubernetes.io/projected/cd0f6520-a360-40d0-a6aa-cd2775f09d94-kube-api-access-t758h\") pod \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.341399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd0f6520-a360-40d0-a6aa-cd2775f09d94-config-volume\") pod \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.341518 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd0f6520-a360-40d0-a6aa-cd2775f09d94-secret-volume\") pod \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\" (UID: \"cd0f6520-a360-40d0-a6aa-cd2775f09d94\") " Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.342233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0f6520-a360-40d0-a6aa-cd2775f09d94-config-volume" (OuterVolumeSpecName: "config-volume") pod "cd0f6520-a360-40d0-a6aa-cd2775f09d94" (UID: "cd0f6520-a360-40d0-a6aa-cd2775f09d94"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.347217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd0f6520-a360-40d0-a6aa-cd2775f09d94-kube-api-access-t758h" (OuterVolumeSpecName: "kube-api-access-t758h") pod "cd0f6520-a360-40d0-a6aa-cd2775f09d94" (UID: "cd0f6520-a360-40d0-a6aa-cd2775f09d94"). InnerVolumeSpecName "kube-api-access-t758h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.347503 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd0f6520-a360-40d0-a6aa-cd2775f09d94-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cd0f6520-a360-40d0-a6aa-cd2775f09d94" (UID: "cd0f6520-a360-40d0-a6aa-cd2775f09d94"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.444235 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t758h\" (UniqueName: \"kubernetes.io/projected/cd0f6520-a360-40d0-a6aa-cd2775f09d94-kube-api-access-t758h\") on node \"crc\" DevicePath \"\"" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.444272 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd0f6520-a360-40d0-a6aa-cd2775f09d94-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.444281 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd0f6520-a360-40d0-a6aa-cd2775f09d94-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.809815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" event={"ID":"cd0f6520-a360-40d0-a6aa-cd2775f09d94","Type":"ContainerDied","Data":"8266053d3f13217bf543226a8e37bb4b52d02f7a85caac19b7cfbf97b7041c88"} Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.809866 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8266053d3f13217bf543226a8e37bb4b52d02f7a85caac19b7cfbf97b7041c88" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.810188 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs" Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.904439 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj"] Jan 28 19:15:04 crc kubenswrapper[4749]: I0128 19:15:04.918416 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493750-s9qvj"] Jan 28 19:15:06 crc kubenswrapper[4749]: I0128 19:15:06.872202 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:15:06 crc kubenswrapper[4749]: E0128 19:15:06.873013 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:15:06 crc kubenswrapper[4749]: I0128 19:15:06.887292 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b018047-b659-49f1-a494-aaf29e2925e3" path="/var/lib/kubelet/pods/2b018047-b659-49f1-a494-aaf29e2925e3/volumes" Jan 28 19:15:17 crc kubenswrapper[4749]: I0128 19:15:17.871709 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:15:17 crc kubenswrapper[4749]: E0128 19:15:17.872442 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:15:28 crc kubenswrapper[4749]: I0128 19:15:28.017813 4749 scope.go:117] "RemoveContainer" containerID="d8aca8147ef4c290231677590ba0026f2ec718ce5a59b716814f7bbb578df6a0" Jan 28 19:15:28 crc kubenswrapper[4749]: I0128 19:15:28.872746 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:15:28 crc kubenswrapper[4749]: E0128 19:15:28.873659 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:15:39 crc kubenswrapper[4749]: I0128 19:15:39.871711 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:15:39 crc kubenswrapper[4749]: E0128 19:15:39.872539 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:15:50 crc kubenswrapper[4749]: I0128 19:15:50.871538 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:15:50 crc kubenswrapper[4749]: E0128 19:15:50.872217 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:16:04 crc kubenswrapper[4749]: I0128 19:16:04.872108 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:16:04 crc kubenswrapper[4749]: E0128 19:16:04.872948 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:16:17 crc kubenswrapper[4749]: I0128 19:16:17.871749 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:16:17 crc kubenswrapper[4749]: E0128 19:16:17.872583 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:16:30 crc kubenswrapper[4749]: I0128 19:16:30.872834 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:16:30 crc kubenswrapper[4749]: E0128 19:16:30.873721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:16:45 crc kubenswrapper[4749]: I0128 19:16:45.872889 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:16:45 crc kubenswrapper[4749]: E0128 19:16:45.873711 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:16:58 crc kubenswrapper[4749]: I0128 19:16:58.871441 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:16:58 crc kubenswrapper[4749]: E0128 19:16:58.872109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:17:12 crc kubenswrapper[4749]: I0128 19:17:12.879676 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:17:12 crc kubenswrapper[4749]: E0128 19:17:12.880754 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:17:26 crc kubenswrapper[4749]: I0128 19:17:26.871876 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:17:26 crc kubenswrapper[4749]: E0128 19:17:26.872729 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:17:28 crc kubenswrapper[4749]: I0128 19:17:28.124696 4749 
scope.go:117] "RemoveContainer" containerID="22ba2cd46ae52a215807f18e8a93072a412a4cf8de274c5563e748e1ec3fe562" Jan 28 19:17:28 crc kubenswrapper[4749]: I0128 19:17:28.146398 4749 scope.go:117] "RemoveContainer" containerID="7a69fd8a9ba21f4203f9323bd979df0a95c25a619aaa2f45c96d2332bdef43ff" Jan 28 19:17:28 crc kubenswrapper[4749]: I0128 19:17:28.173644 4749 scope.go:117] "RemoveContainer" containerID="44cc35ad966217ba1846b545827429425271f5581df3b36dcfa72f5567244f51" Jan 28 19:17:39 crc kubenswrapper[4749]: I0128 19:17:39.872771 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:17:39 crc kubenswrapper[4749]: E0128 19:17:39.874529 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:17:52 crc kubenswrapper[4749]: I0128 19:17:52.884108 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:17:52 crc kubenswrapper[4749]: E0128 19:17:52.888267 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:18:03 crc kubenswrapper[4749]: I0128 19:18:03.871729 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:18:04 crc kubenswrapper[4749]: I0128 19:18:04.514223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"ccb1545741042615267f506b4475d7829e628c23e505c88b0c3a373d66f36c21"} Jan 28 19:20:27 crc kubenswrapper[4749]: I0128 19:20:27.467606 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:20:27 crc kubenswrapper[4749]: I0128 19:20:27.468179 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:20:37 crc kubenswrapper[4749]: I0128 19:20:37.472521 4749 patch_prober.go:28] interesting pod/metrics-server-555d778d79-lwsht container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 19:20:37 crc kubenswrapper[4749]: I0128 19:20:37.473204 4749 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-monitoring/metrics-server-555d778d79-lwsht" podUID="559588ad-0b94-4587-9a9c-94fae9fdd016" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.206265 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xrrpl"] Jan 28 19:20:45 crc kubenswrapper[4749]: E0128 19:20:45.207490 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd0f6520-a360-40d0-a6aa-cd2775f09d94" containerName="collect-profiles" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.207507 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0f6520-a360-40d0-a6aa-cd2775f09d94" containerName="collect-profiles" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.207778 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd0f6520-a360-40d0-a6aa-cd2775f09d94" containerName="collect-profiles" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.209612 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.217583 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xrrpl"] Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.312205 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-catalog-content\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.312262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtws\" (UniqueName: \"kubernetes.io/projected/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-kube-api-access-sdtws\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.312535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-utilities\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.414620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-utilities\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.414798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-catalog-content\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.414830 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtws\" (UniqueName: \"kubernetes.io/projected/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-kube-api-access-sdtws\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.415165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-utilities\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.415235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-catalog-content\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.434107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtws\" (UniqueName: \"kubernetes.io/projected/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-kube-api-access-sdtws\") pod \"certified-operators-xrrpl\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:45 crc kubenswrapper[4749]: I0128 19:20:45.528241 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:20:46 crc kubenswrapper[4749]: I0128 19:20:46.056293 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xrrpl"] Jan 28 19:20:46 crc kubenswrapper[4749]: I0128 19:20:46.109282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrrpl" event={"ID":"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5","Type":"ContainerStarted","Data":"2ca4ef68bb2c118ecb4d06db22546ed4567fc21eca5e1b9198e97ecbb121a115"} Jan 28 19:20:47 crc kubenswrapper[4749]: I0128 19:20:47.121242 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerID="0917a946e78234f509ed37f928745c3182b59516da97321a6a92e723b2e135cd" exitCode=0 Jan 28 19:20:47 crc kubenswrapper[4749]: I0128 19:20:47.121318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrrpl" event={"ID":"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5","Type":"ContainerDied","Data":"0917a946e78234f509ed37f928745c3182b59516da97321a6a92e723b2e135cd"} Jan 28 19:20:47 crc kubenswrapper[4749]: I0128 19:20:47.123426 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 19:20:49 crc kubenswrapper[4749]: I0128 19:20:49.144275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrrpl" event={"ID":"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5","Type":"ContainerStarted","Data":"5f04b3d75157b21b13269a33c328ead9d6a99b9ede835d27361b539a5a340a8b"} Jan 28 19:20:56 crc kubenswrapper[4749]: I0128 19:20:56.212356 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerID="5f04b3d75157b21b13269a33c328ead9d6a99b9ede835d27361b539a5a340a8b" exitCode=0 Jan 28 19:20:56 crc 
kubenswrapper[4749]: I0128 19:20:56.212445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrrpl" event={"ID":"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5","Type":"ContainerDied","Data":"5f04b3d75157b21b13269a33c328ead9d6a99b9ede835d27361b539a5a340a8b"} Jan 28 19:20:57 crc kubenswrapper[4749]: I0128 19:20:57.224141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrrpl" event={"ID":"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5","Type":"ContainerStarted","Data":"19a9e830da88f9bb4e84b29fedb766712585dc673c97068918265b07fc0186cc"} Jan 28 19:20:57 crc kubenswrapper[4749]: I0128 19:20:57.241734 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xrrpl" podStartSLOduration=2.762060628 podStartE2EDuration="12.241714232s" podCreationTimestamp="2026-01-28 19:20:45 +0000 UTC" firstStartedPulling="2026-01-28 19:20:47.123088791 +0000 UTC m=+2715.134615576" lastFinishedPulling="2026-01-28 19:20:56.602742405 +0000 UTC m=+2724.614269180" observedRunningTime="2026-01-28 19:20:57.23958669 +0000 UTC m=+2725.251113465" watchObservedRunningTime="2026-01-28 19:20:57.241714232 +0000 UTC m=+2725.253241007" Jan 28 19:20:57 crc kubenswrapper[4749]: I0128 19:20:57.466978 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:20:57 crc kubenswrapper[4749]: I0128 19:20:57.467056 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.210287 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v92t6"] Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.213597 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.241768 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v92t6"] Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.256881 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-utilities\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.257032 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzccw\" (UniqueName: \"kubernetes.io/projected/a207d291-61bc-4caf-becc-ddde78c4a044-kube-api-access-zzccw\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.257250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-catalog-content\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.359741 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzccw\" (UniqueName: \"kubernetes.io/projected/a207d291-61bc-4caf-becc-ddde78c4a044-kube-api-access-zzccw\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.360147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-catalog-content\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.360309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-utilities\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.360817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-utilities\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.360860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-catalog-content\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.402783 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zzccw\" (UniqueName: \"kubernetes.io/projected/a207d291-61bc-4caf-becc-ddde78c4a044-kube-api-access-zzccw\") pod \"redhat-marketplace-v92t6\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:04 crc kubenswrapper[4749]: I0128 19:21:04.542935 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:05 crc kubenswrapper[4749]: I0128 19:21:05.166667 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v92t6"] Jan 28 19:21:05 crc kubenswrapper[4749]: W0128 19:21:05.168288 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda207d291_61bc_4caf_becc_ddde78c4a044.slice/crio-03575d8d874825ca10668c6b8d9fdee432f3d67180a81193ab500be41e6da4f5 WatchSource:0}: Error finding container 03575d8d874825ca10668c6b8d9fdee432f3d67180a81193ab500be41e6da4f5: Status 404 returned error can't find the container with id 03575d8d874825ca10668c6b8d9fdee432f3d67180a81193ab500be41e6da4f5 Jan 28 19:21:05 crc kubenswrapper[4749]: I0128 19:21:05.299942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92t6" event={"ID":"a207d291-61bc-4caf-becc-ddde78c4a044","Type":"ContainerStarted","Data":"03575d8d874825ca10668c6b8d9fdee432f3d67180a81193ab500be41e6da4f5"} Jan 28 19:21:05 crc kubenswrapper[4749]: I0128 19:21:05.529045 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:21:05 crc kubenswrapper[4749]: I0128 19:21:05.529100 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:21:06 crc kubenswrapper[4749]: I0128 19:21:06.309306 4749 generic.go:334] "Generic (PLEG): container finished" podID="a207d291-61bc-4caf-becc-ddde78c4a044" containerID="bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058" exitCode=0 Jan 28 19:21:06 crc kubenswrapper[4749]: I0128 19:21:06.309375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92t6" event={"ID":"a207d291-61bc-4caf-becc-ddde78c4a044","Type":"ContainerDied","Data":"bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058"} Jan 28 19:21:06 crc kubenswrapper[4749]: I0128 19:21:06.573879 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xrrpl" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="registry-server" probeResult="failure" output=< Jan 28 19:21:06 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:21:06 crc kubenswrapper[4749]: > Jan 28 19:21:07 crc kubenswrapper[4749]: I0128 19:21:07.321818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92t6" event={"ID":"a207d291-61bc-4caf-becc-ddde78c4a044","Type":"ContainerStarted","Data":"7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b"} Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.199120 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kr4lr"] Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.202383 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.257891 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kr4lr"] Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.262022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-utilities\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.262576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bntlw\" (UniqueName: \"kubernetes.io/projected/8d9406d0-b104-4d14-acf0-b0a04349d828-kube-api-access-bntlw\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.262702 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-catalog-content\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.363703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-utilities\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.364225 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-utilities\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.364591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bntlw\" (UniqueName: \"kubernetes.io/projected/8d9406d0-b104-4d14-acf0-b0a04349d828-kube-api-access-bntlw\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.364789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-catalog-content\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.365061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-catalog-content\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.389995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bntlw\" (UniqueName: \"kubernetes.io/projected/8d9406d0-b104-4d14-acf0-b0a04349d828-kube-api-access-bntlw\") pod \"redhat-operators-kr4lr\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:08 crc kubenswrapper[4749]: I0128 19:21:08.536552 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:09 crc kubenswrapper[4749]: I0128 19:21:09.084881 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kr4lr"] Jan 28 19:21:09 crc kubenswrapper[4749]: I0128 19:21:09.354287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kr4lr" event={"ID":"8d9406d0-b104-4d14-acf0-b0a04349d828","Type":"ContainerStarted","Data":"a274fc4a885137ff6a7ea750dd3270dac769250bc4e58b402a2fd505e95b8357"} Jan 28 19:21:09 crc kubenswrapper[4749]: I0128 19:21:09.361577 4749 generic.go:334] "Generic (PLEG): container finished" podID="a207d291-61bc-4caf-becc-ddde78c4a044" containerID="7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b" exitCode=0 Jan 28 19:21:09 crc kubenswrapper[4749]: I0128 19:21:09.361666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92t6" event={"ID":"a207d291-61bc-4caf-becc-ddde78c4a044","Type":"ContainerDied","Data":"7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b"} Jan 28 19:21:10 crc kubenswrapper[4749]: I0128 19:21:10.376191 4749 generic.go:334] "Generic (PLEG): container finished" podID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerID="b65bee7fc629576c1d7e96794df755ac6e443f4e65300880d56bd8004063adf9" exitCode=0 Jan 28 19:21:10 crc kubenswrapper[4749]: I0128 19:21:10.376271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kr4lr" event={"ID":"8d9406d0-b104-4d14-acf0-b0a04349d828","Type":"ContainerDied","Data":"b65bee7fc629576c1d7e96794df755ac6e443f4e65300880d56bd8004063adf9"} Jan 28 19:21:12 crc kubenswrapper[4749]: I0128 19:21:12.399272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kr4lr" event={"ID":"8d9406d0-b104-4d14-acf0-b0a04349d828","Type":"ContainerStarted","Data":"cb0ded056dbf664cf4746b2ea85be2efa31869bcd6a112c0b4e5ce177f47d632"} Jan 28 19:21:12 crc kubenswrapper[4749]: I0128 19:21:12.403197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92t6" event={"ID":"a207d291-61bc-4caf-becc-ddde78c4a044","Type":"ContainerStarted","Data":"0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30"} Jan 28 19:21:12 crc kubenswrapper[4749]: I0128 19:21:12.451954 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v92t6" podStartSLOduration=3.665218465 podStartE2EDuration="8.451933483s" podCreationTimestamp="2026-01-28 19:21:04 +0000 UTC" firstStartedPulling="2026-01-28 19:21:06.312537567 +0000 UTC m=+2734.324064342" lastFinishedPulling="2026-01-28 19:21:11.099252585 +0000 UTC m=+2739.110779360" observedRunningTime="2026-01-28 19:21:12.444707047 +0000 UTC m=+2740.456233842" watchObservedRunningTime="2026-01-28 19:21:12.451933483 +0000 UTC m=+2740.463460258" Jan 28 19:21:14 crc kubenswrapper[4749]: I0128 19:21:14.548723 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:14 crc kubenswrapper[4749]: I0128 19:21:14.549541 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:15 crc kubenswrapper[4749]: I0128 19:21:15.581269 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:21:15 crc kubenswrapper[4749]: I0128 19:21:15.609699 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-v92t6" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="registry-server" probeResult="failure" output=< Jan 28 19:21:15 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:21:15 crc kubenswrapper[4749]: > Jan 28 19:21:15 crc kubenswrapper[4749]: I0128 19:21:15.632318 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:21:16 crc kubenswrapper[4749]: I0128 19:21:16.787261 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xrrpl"] Jan 28 19:21:17 crc kubenswrapper[4749]: I0128 19:21:17.450441 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xrrpl" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="registry-server" containerID="cri-o://19a9e830da88f9bb4e84b29fedb766712585dc673c97068918265b07fc0186cc" gracePeriod=2 Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.471958 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerID="19a9e830da88f9bb4e84b29fedb766712585dc673c97068918265b07fc0186cc" exitCode=0 Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.472497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrrpl" event={"ID":"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5","Type":"ContainerDied","Data":"19a9e830da88f9bb4e84b29fedb766712585dc673c97068918265b07fc0186cc"} Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.604123 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.680736 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdtws\" (UniqueName: \"kubernetes.io/projected/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-kube-api-access-sdtws\") pod \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.680991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-catalog-content\") pod \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.681086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-utilities\") pod \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\" (UID: \"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5\") " Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.682461 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-utilities" (OuterVolumeSpecName: "utilities") pod "5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" (UID: "5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.700623 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-kube-api-access-sdtws" (OuterVolumeSpecName: "kube-api-access-sdtws") pod "5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" (UID: "5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5"). InnerVolumeSpecName "kube-api-access-sdtws". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.764913 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" (UID: "5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.783923 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.783966 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:19 crc kubenswrapper[4749]: I0128 19:21:19.783976 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdtws\" (UniqueName: \"kubernetes.io/projected/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5-kube-api-access-sdtws\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:20 crc kubenswrapper[4749]: I0128 19:21:20.488351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrrpl" event={"ID":"5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5","Type":"ContainerDied","Data":"2ca4ef68bb2c118ecb4d06db22546ed4567fc21eca5e1b9198e97ecbb121a115"} Jan 28 19:21:20 crc kubenswrapper[4749]: I0128 19:21:20.488703 4749 scope.go:117] "RemoveContainer" containerID="19a9e830da88f9bb4e84b29fedb766712585dc673c97068918265b07fc0186cc" Jan 28 19:21:20 crc kubenswrapper[4749]: I0128 19:21:20.488963 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrrpl" Jan 28 19:21:20 crc kubenswrapper[4749]: I0128 19:21:20.521252 4749 scope.go:117] "RemoveContainer" containerID="5f04b3d75157b21b13269a33c328ead9d6a99b9ede835d27361b539a5a340a8b" Jan 28 19:21:20 crc kubenswrapper[4749]: I0128 19:21:20.539605 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xrrpl"] Jan 28 19:21:20 crc kubenswrapper[4749]: I0128 19:21:20.551616 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xrrpl"] Jan 28 19:21:20 crc kubenswrapper[4749]: I0128 19:21:20.560215 4749 scope.go:117] "RemoveContainer" containerID="0917a946e78234f509ed37f928745c3182b59516da97321a6a92e723b2e135cd" Jan 28 19:21:20 crc kubenswrapper[4749]: I0128 19:21:20.887979 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" path="/var/lib/kubelet/pods/5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5/volumes" Jan 28 19:21:25 crc kubenswrapper[4749]: I0128 19:21:25.596152 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-v92t6" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="registry-server" probeResult="failure" output=< Jan 28 19:21:25 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:21:25 crc kubenswrapper[4749]: > Jan 28 19:21:27 crc kubenswrapper[4749]: I0128 19:21:27.467796 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:21:27 crc kubenswrapper[4749]: I0128 19:21:27.468367 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:21:27 crc kubenswrapper[4749]: I0128 19:21:27.468425 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:21:27 crc kubenswrapper[4749]: I0128 19:21:27.469452 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ccb1545741042615267f506b4475d7829e628c23e505c88b0c3a373d66f36c21"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:21:27 crc kubenswrapper[4749]: I0128 19:21:27.469521 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://ccb1545741042615267f506b4475d7829e628c23e505c88b0c3a373d66f36c21" gracePeriod=600 Jan 28 19:21:27 crc kubenswrapper[4749]: I0128 19:21:27.562179 4749 generic.go:334] "Generic (PLEG): container finished" podID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerID="cb0ded056dbf664cf4746b2ea85be2efa31869bcd6a112c0b4e5ce177f47d632" exitCode=0 Jan 28 19:21:27 crc kubenswrapper[4749]: I0128 19:21:27.562222 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kr4lr" event={"ID":"8d9406d0-b104-4d14-acf0-b0a04349d828","Type":"ContainerDied","Data":"cb0ded056dbf664cf4746b2ea85be2efa31869bcd6a112c0b4e5ce177f47d632"} Jan 28 19:21:28 crc kubenswrapper[4749]: I0128 19:21:28.573468 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="ccb1545741042615267f506b4475d7829e628c23e505c88b0c3a373d66f36c21" exitCode=0 Jan 28 19:21:28 crc kubenswrapper[4749]: I0128 19:21:28.573532 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"ccb1545741042615267f506b4475d7829e628c23e505c88b0c3a373d66f36c21"} Jan 28 19:21:28 crc kubenswrapper[4749]: I0128 19:21:28.573986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e"} Jan 28 19:21:28 crc kubenswrapper[4749]: I0128 19:21:28.574008 4749 scope.go:117] "RemoveContainer" containerID="99661bd15298276bf4f1175c866c77a3b0dffebbf163d73cfe6bd20b981ca184" Jan 28 19:21:29 crc kubenswrapper[4749]: I0128 19:21:29.586848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kr4lr" event={"ID":"8d9406d0-b104-4d14-acf0-b0a04349d828","Type":"ContainerStarted","Data":"ce74d9485a5b772d2e61f4930c0120ef63699fe14025c54c5131ade367ff39b9"} Jan 28 19:21:29 crc kubenswrapper[4749]: I0128 19:21:29.610585 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kr4lr" podStartSLOduration=3.308345325 podStartE2EDuration="21.610563503s" podCreationTimestamp="2026-01-28 19:21:08 +0000 UTC" firstStartedPulling="2026-01-28 
19:21:10.382212564 +0000 UTC m=+2738.393739339" lastFinishedPulling="2026-01-28 19:21:28.684430742 +0000 UTC m=+2756.695957517" observedRunningTime="2026-01-28 19:21:29.601708998 +0000 UTC m=+2757.613235843" watchObservedRunningTime="2026-01-28 19:21:29.610563503 +0000 UTC m=+2757.622090278" Jan 28 19:21:34 crc kubenswrapper[4749]: I0128 19:21:34.589874 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:34 crc kubenswrapper[4749]: I0128 19:21:34.658318 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:35 crc kubenswrapper[4749]: I0128 19:21:35.411650 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v92t6"] Jan 28 19:21:35 crc kubenswrapper[4749]: I0128 19:21:35.643444 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v92t6" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="registry-server" containerID="cri-o://0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30" gracePeriod=2 Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.406135 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.508392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-catalog-content\") pod \"a207d291-61bc-4caf-becc-ddde78c4a044\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.508754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzccw\" (UniqueName: \"kubernetes.io/projected/a207d291-61bc-4caf-becc-ddde78c4a044-kube-api-access-zzccw\") pod \"a207d291-61bc-4caf-becc-ddde78c4a044\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.508824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-utilities\") pod \"a207d291-61bc-4caf-becc-ddde78c4a044\" (UID: \"a207d291-61bc-4caf-becc-ddde78c4a044\") " Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.509664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-utilities" (OuterVolumeSpecName: "utilities") pod "a207d291-61bc-4caf-becc-ddde78c4a044" (UID: "a207d291-61bc-4caf-becc-ddde78c4a044"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.514668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a207d291-61bc-4caf-becc-ddde78c4a044-kube-api-access-zzccw" (OuterVolumeSpecName: "kube-api-access-zzccw") pod "a207d291-61bc-4caf-becc-ddde78c4a044" (UID: "a207d291-61bc-4caf-becc-ddde78c4a044"). InnerVolumeSpecName "kube-api-access-zzccw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.552768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a207d291-61bc-4caf-becc-ddde78c4a044" (UID: "a207d291-61bc-4caf-becc-ddde78c4a044"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.613132 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.613186 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzccw\" (UniqueName: \"kubernetes.io/projected/a207d291-61bc-4caf-becc-ddde78c4a044-kube-api-access-zzccw\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.613207 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a207d291-61bc-4caf-becc-ddde78c4a044-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.655637 4749 generic.go:334] "Generic (PLEG): container finished" podID="a207d291-61bc-4caf-becc-ddde78c4a044" containerID="0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30" exitCode=0 Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.655678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92t6" event={"ID":"a207d291-61bc-4caf-becc-ddde78c4a044","Type":"ContainerDied","Data":"0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30"} Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.655710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92t6" event={"ID":"a207d291-61bc-4caf-becc-ddde78c4a044","Type":"ContainerDied","Data":"03575d8d874825ca10668c6b8d9fdee432f3d67180a81193ab500be41e6da4f5"} Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.655728 4749 scope.go:117] "RemoveContainer" containerID="0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.655725 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v92t6" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.684907 4749 scope.go:117] "RemoveContainer" containerID="7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.699657 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v92t6"] Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.708230 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v92t6"] Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.725493 4749 scope.go:117] "RemoveContainer" containerID="bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.763707 4749 scope.go:117] "RemoveContainer" containerID="0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30" Jan 28 19:21:36 crc kubenswrapper[4749]: E0128 19:21:36.764195 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30\": container with ID starting with 0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30 not found: ID does not exist" containerID="0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.764256 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30"} err="failed to get container status \"0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30\": rpc error: code = NotFound desc = could not find container \"0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30\": container with ID starting with 0c7401a4e49a1269cd571591e22948860e94e2cf3c84d605db8e8afa5a910f30 not found: ID does not exist" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.764292 4749 scope.go:117] "RemoveContainer" containerID="7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b" Jan 28 19:21:36 crc kubenswrapper[4749]: E0128 19:21:36.764860 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b\": container with ID starting with 7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b not found: ID does not exist" containerID="7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.764906 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b"} err="failed to get container status \"7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b\": rpc error: code = NotFound desc = could not find container \"7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b\": container with ID starting with 7da4dd176f502353923f6d7a26aeaa6297dca2d1ea539621501167a589110c7b not found: ID does not exist" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.764934 4749 scope.go:117] "RemoveContainer" containerID="bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058" Jan 28 19:21:36 crc kubenswrapper[4749]: E0128 19:21:36.765228 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058\": container with ID starting with bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058 not found: ID does not exist" containerID="bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.765283 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058"} err="failed to get container status \"bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058\": rpc error: code = NotFound desc = could not find container \"bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058\": container with ID starting with bc946dfc31a31812c9a2e1a596ae56436d9ae85456d99d612d1147cedb04f058 not found: ID does not exist" Jan 28 19:21:36 crc kubenswrapper[4749]: I0128 19:21:36.886193 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" path="/var/lib/kubelet/pods/a207d291-61bc-4caf-becc-ddde78c4a044/volumes" Jan 28 19:21:38 crc kubenswrapper[4749]: I0128 19:21:38.537117 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:38 crc kubenswrapper[4749]: I0128 19:21:38.537932 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:38 crc kubenswrapper[4749]: I0128 19:21:38.593242 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:38 crc kubenswrapper[4749]: I0128 19:21:38.736387 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:39 crc kubenswrapper[4749]: I0128 19:21:39.818289 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kr4lr"] Jan 28 19:21:40 crc kubenswrapper[4749]: I0128 19:21:40.697569 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kr4lr" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerName="registry-server" containerID="cri-o://ce74d9485a5b772d2e61f4930c0120ef63699fe14025c54c5131ade367ff39b9" gracePeriod=2 Jan 28 19:21:41 crc kubenswrapper[4749]: I0128 19:21:41.710829 4749 generic.go:334] "Generic (PLEG): container finished" podID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerID="ce74d9485a5b772d2e61f4930c0120ef63699fe14025c54c5131ade367ff39b9" exitCode=0 Jan 28 19:21:41 crc kubenswrapper[4749]: I0128 19:21:41.711066 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kr4lr" event={"ID":"8d9406d0-b104-4d14-acf0-b0a04349d828","Type":"ContainerDied","Data":"ce74d9485a5b772d2e61f4930c0120ef63699fe14025c54c5131ade367ff39b9"} Jan 28 19:21:42 crc kubenswrapper[4749]: I0128 19:21:42.935741 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.065774 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-catalog-content\") pod \"8d9406d0-b104-4d14-acf0-b0a04349d828\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.065968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-utilities\") pod \"8d9406d0-b104-4d14-acf0-b0a04349d828\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.066055 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bntlw\" (UniqueName: \"kubernetes.io/projected/8d9406d0-b104-4d14-acf0-b0a04349d828-kube-api-access-bntlw\") pod \"8d9406d0-b104-4d14-acf0-b0a04349d828\" (UID: \"8d9406d0-b104-4d14-acf0-b0a04349d828\") " Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.068296 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-utilities" (OuterVolumeSpecName: "utilities") pod "8d9406d0-b104-4d14-acf0-b0a04349d828" (UID: "8d9406d0-b104-4d14-acf0-b0a04349d828"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.077030 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9406d0-b104-4d14-acf0-b0a04349d828-kube-api-access-bntlw" (OuterVolumeSpecName: "kube-api-access-bntlw") pod "8d9406d0-b104-4d14-acf0-b0a04349d828" (UID: "8d9406d0-b104-4d14-acf0-b0a04349d828"). InnerVolumeSpecName "kube-api-access-bntlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.169448 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.169494 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bntlw\" (UniqueName: \"kubernetes.io/projected/8d9406d0-b104-4d14-acf0-b0a04349d828-kube-api-access-bntlw\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.210458 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d9406d0-b104-4d14-acf0-b0a04349d828" (UID: "8d9406d0-b104-4d14-acf0-b0a04349d828"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.272002 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9406d0-b104-4d14-acf0-b0a04349d828-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.732745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kr4lr" event={"ID":"8d9406d0-b104-4d14-acf0-b0a04349d828","Type":"ContainerDied","Data":"a274fc4a885137ff6a7ea750dd3270dac769250bc4e58b402a2fd505e95b8357"} Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.732824 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kr4lr" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.733058 4749 scope.go:117] "RemoveContainer" containerID="ce74d9485a5b772d2e61f4930c0120ef63699fe14025c54c5131ade367ff39b9" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.768189 4749 scope.go:117] "RemoveContainer" containerID="cb0ded056dbf664cf4746b2ea85be2efa31869bcd6a112c0b4e5ce177f47d632" Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.782078 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kr4lr"] Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.793707 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kr4lr"] Jan 28 19:21:43 crc kubenswrapper[4749]: I0128 19:21:43.795055 4749 scope.go:117] "RemoveContainer" containerID="b65bee7fc629576c1d7e96794df755ac6e443f4e65300880d56bd8004063adf9" Jan 28 19:21:44 crc kubenswrapper[4749]: I0128 19:21:44.883257 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" path="/var/lib/kubelet/pods/8d9406d0-b104-4d14-acf0-b0a04349d828/volumes" Jan 28 19:21:55 crc kubenswrapper[4749]: I0128 19:21:55.760525 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="3212c9f3-5620-46b0-bece-ec7ea4b9763a" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 28 19:21:57 crc kubenswrapper[4749]: I0128 19:21:57.562130 4749 trace.go:236] Trace[1313929833]: "Calculate volume metrics of config for pod openshift-controller-manager/controller-manager-79558b8d74-847nd" (28-Jan-2026 19:21:54.198) (total time: 3363ms): Jan 28 19:21:57 crc kubenswrapper[4749]: Trace[1313929833]: [3.363390568s] [3.363390568s] END Jan 28 19:21:57 crc kubenswrapper[4749]: I0128 19:21:57.570602 4749 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 6.942766004s: [/var/lib/containers/storage/overlay/4f1a920799d6c95f0577922d5b135660d6135bbb96e9333c66b3f176065805cf/diff /var/log/pods/openstack_keystone-54bd464d95-gdqhz_0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae/keystone-api/0.log]; will not log again for this container unless duration exceeds 2s Jan 28 19:23:27 crc kubenswrapper[4749]: I0128 19:23:27.467286 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:23:27 crc kubenswrapper[4749]: I0128 19:23:27.468089 4749 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:23:57 crc kubenswrapper[4749]: I0128 19:23:57.466887 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:23:57 crc kubenswrapper[4749]: I0128 19:23:57.467367 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:24:27 crc kubenswrapper[4749]: I0128 19:24:27.466983 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:24:27 crc kubenswrapper[4749]: I0128 19:24:27.467684 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:24:27 crc kubenswrapper[4749]: I0128 19:24:27.467729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:24:27 crc kubenswrapper[4749]: I0128 19:24:27.468581 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:24:27 crc kubenswrapper[4749]: I0128 19:24:27.468631 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" gracePeriod=600 Jan 28 19:24:28 crc kubenswrapper[4749]: E0128 19:24:28.103987 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:24:28 crc kubenswrapper[4749]: I0128 19:24:28.333902 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" exitCode=0 Jan 
28 19:24:28 crc kubenswrapper[4749]: I0128 19:24:28.333984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e"} Jan 28 19:24:28 crc kubenswrapper[4749]: I0128 19:24:28.334273 4749 scope.go:117] "RemoveContainer" containerID="ccb1545741042615267f506b4475d7829e628c23e505c88b0c3a373d66f36c21" Jan 28 19:24:28 crc kubenswrapper[4749]: I0128 19:24:28.335127 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:24:28 crc kubenswrapper[4749]: E0128 19:24:28.335526 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:24:43 crc kubenswrapper[4749]: I0128 19:24:43.871890 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:24:43 crc kubenswrapper[4749]: E0128 19:24:43.872924 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:24:57 crc kubenswrapper[4749]: I0128 19:24:57.872272 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:24:57 crc kubenswrapper[4749]: E0128 19:24:57.873362 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:25:08 crc kubenswrapper[4749]: I0128 19:25:08.872430 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:25:08 crc kubenswrapper[4749]: E0128 19:25:08.874817 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:25:22 crc kubenswrapper[4749]: I0128 19:25:22.871928 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:25:22 crc kubenswrapper[4749]: E0128 19:25:22.873465 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:25:35 crc kubenswrapper[4749]: I0128 19:25:35.872237 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:25:35 crc kubenswrapper[4749]: E0128 19:25:35.872948 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:25:47 crc kubenswrapper[4749]: I0128 19:25:47.871750 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:25:47 crc kubenswrapper[4749]: E0128 19:25:47.872623 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:25:59 crc kubenswrapper[4749]: I0128 19:25:59.871891 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:25:59 crc kubenswrapper[4749]: E0128 19:25:59.872716 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.059555 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tmzfm"] Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060511 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="extract-utilities" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060526 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="extract-utilities" Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060547 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060553 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060567 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerName="extract-content" Jan 28 19:26:08 crc 
kubenswrapper[4749]: I0128 19:26:08.060573 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerName="extract-content" Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060589 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="extract-utilities" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060595 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="extract-utilities" Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060608 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="extract-content" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060614 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="extract-content" Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060626 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="extract-content" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060633 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="extract-content" Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060644 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060650 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060666 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerName="extract-utilities" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060673 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerName="extract-utilities" Jan 28 19:26:08 crc kubenswrapper[4749]: E0128 19:26:08.060682 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060687 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060886 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9406d0-b104-4d14-acf0-b0a04349d828" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060900 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a207d291-61bc-4caf-becc-ddde78c4a044" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.060910 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e46ad3c-0635-4ec4-9928-dd29d7a3eaa5" containerName="registry-server" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.062475 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.080694 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmzfm"] Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.210843 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-catalog-content\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.211000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqxvt\" (UniqueName: \"kubernetes.io/projected/b7e794c5-0c7d-4459-8159-59ed3e56224d-kube-api-access-nqxvt\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.211061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-utilities\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.313231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-catalog-content\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.313402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqxvt\" (UniqueName: \"kubernetes.io/projected/b7e794c5-0c7d-4459-8159-59ed3e56224d-kube-api-access-nqxvt\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.313462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-utilities\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.314264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-utilities\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.314418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-catalog-content\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.347682 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nqxvt\" (UniqueName: \"kubernetes.io/projected/b7e794c5-0c7d-4459-8159-59ed3e56224d-kube-api-access-nqxvt\") pod \"community-operators-tmzfm\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.408398 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:08 crc kubenswrapper[4749]: I0128 19:26:08.989937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmzfm"] Jan 28 19:26:09 crc kubenswrapper[4749]: I0128 19:26:09.368965 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerID="6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3" exitCode=0 Jan 28 19:26:09 crc kubenswrapper[4749]: I0128 19:26:09.369084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmzfm" event={"ID":"b7e794c5-0c7d-4459-8159-59ed3e56224d","Type":"ContainerDied","Data":"6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3"} Jan 28 19:26:09 crc kubenswrapper[4749]: I0128 19:26:09.369294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmzfm" event={"ID":"b7e794c5-0c7d-4459-8159-59ed3e56224d","Type":"ContainerStarted","Data":"9210ffa27b134f84142c243744d77b6ee58f11ead3cbd8ec3905cb809fbd1d0d"} Jan 28 19:26:09 crc kubenswrapper[4749]: I0128 19:26:09.372460 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 19:26:11 crc kubenswrapper[4749]: I0128 19:26:11.390869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmzfm" event={"ID":"b7e794c5-0c7d-4459-8159-59ed3e56224d","Type":"ContainerStarted","Data":"6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db"} Jan 28 19:26:12 crc kubenswrapper[4749]: I0128 19:26:12.401651 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerID="6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db" exitCode=0 Jan 28 19:26:12 crc kubenswrapper[4749]: I0128 19:26:12.401719 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmzfm" event={"ID":"b7e794c5-0c7d-4459-8159-59ed3e56224d","Type":"ContainerDied","Data":"6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db"} Jan 28 19:26:13 crc kubenswrapper[4749]: I0128 19:26:13.412114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmzfm" event={"ID":"b7e794c5-0c7d-4459-8159-59ed3e56224d","Type":"ContainerStarted","Data":"68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014"} Jan 28 19:26:13 crc kubenswrapper[4749]: I0128 19:26:13.437856 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tmzfm" podStartSLOduration=1.98400519 podStartE2EDuration="5.437838908s" podCreationTimestamp="2026-01-28 19:26:08 +0000 UTC" firstStartedPulling="2026-01-28 19:26:09.372176561 +0000 UTC m=+3037.383703336" lastFinishedPulling="2026-01-28 19:26:12.826010279 +0000 UTC m=+3040.837537054" observedRunningTime="2026-01-28 19:26:13.427159114 +0000 UTC m=+3041.438685889" watchObservedRunningTime="2026-01-28 
19:26:13.437838908 +0000 UTC m=+3041.449365683" Jan 28 19:26:14 crc kubenswrapper[4749]: I0128 19:26:14.871298 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:26:14 crc kubenswrapper[4749]: E0128 19:26:14.871926 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:26:18 crc kubenswrapper[4749]: I0128 19:26:18.409237 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:18 crc kubenswrapper[4749]: I0128 19:26:18.409825 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:18 crc kubenswrapper[4749]: I0128 19:26:18.479205 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:18 crc kubenswrapper[4749]: I0128 19:26:18.537499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:18 crc kubenswrapper[4749]: I0128 19:26:18.720390 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmzfm"] Jan 28 19:26:20 crc kubenswrapper[4749]: I0128 19:26:20.480657 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tmzfm" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerName="registry-server" containerID="cri-o://68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014" gracePeriod=2 Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.007923 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.130781 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-catalog-content\") pod \"b7e794c5-0c7d-4459-8159-59ed3e56224d\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.130847 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqxvt\" (UniqueName: \"kubernetes.io/projected/b7e794c5-0c7d-4459-8159-59ed3e56224d-kube-api-access-nqxvt\") pod \"b7e794c5-0c7d-4459-8159-59ed3e56224d\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.131224 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-utilities\") pod \"b7e794c5-0c7d-4459-8159-59ed3e56224d\" (UID: \"b7e794c5-0c7d-4459-8159-59ed3e56224d\") " Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.132182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-utilities" (OuterVolumeSpecName: "utilities") pod "b7e794c5-0c7d-4459-8159-59ed3e56224d" (UID: "b7e794c5-0c7d-4459-8159-59ed3e56224d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.136558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e794c5-0c7d-4459-8159-59ed3e56224d-kube-api-access-nqxvt" (OuterVolumeSpecName: "kube-api-access-nqxvt") pod "b7e794c5-0c7d-4459-8159-59ed3e56224d" (UID: "b7e794c5-0c7d-4459-8159-59ed3e56224d"). InnerVolumeSpecName "kube-api-access-nqxvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.202703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7e794c5-0c7d-4459-8159-59ed3e56224d" (UID: "b7e794c5-0c7d-4459-8159-59ed3e56224d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.233965 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.234001 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e794c5-0c7d-4459-8159-59ed3e56224d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.234011 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqxvt\" (UniqueName: \"kubernetes.io/projected/b7e794c5-0c7d-4459-8159-59ed3e56224d-kube-api-access-nqxvt\") on node \"crc\" DevicePath \"\"" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.491410 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerID="68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014" exitCode=0 Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.491466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmzfm" event={"ID":"b7e794c5-0c7d-4459-8159-59ed3e56224d","Type":"ContainerDied","Data":"68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014"} Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.491520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmzfm" event={"ID":"b7e794c5-0c7d-4459-8159-59ed3e56224d","Type":"ContainerDied","Data":"9210ffa27b134f84142c243744d77b6ee58f11ead3cbd8ec3905cb809fbd1d0d"} Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.491540 4749 scope.go:117] "RemoveContainer" containerID="68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.492667 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmzfm" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.512441 4749 scope.go:117] "RemoveContainer" containerID="6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.532907 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmzfm"] Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.545150 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tmzfm"] Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.551740 4749 scope.go:117] "RemoveContainer" containerID="6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.607727 4749 scope.go:117] "RemoveContainer" containerID="68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014" Jan 28 19:26:21 crc kubenswrapper[4749]: E0128 19:26:21.608274 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014\": container with ID starting with 68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014 not found: ID does not exist" containerID="68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.608320 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014"} err="failed to get container status \"68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014\": rpc error: code = NotFound desc = could not find container \"68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014\": container with ID starting with 68e26002a042999799e11c601ec4bec2fa1b31ea4c3242b525b12010d1da3014 not found: ID does not exist" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.608431 4749 scope.go:117] "RemoveContainer" containerID="6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db" Jan 28 19:26:21 crc kubenswrapper[4749]: E0128 19:26:21.608891 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db\": container with ID starting with 6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db not found: ID does not exist" containerID="6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.608935 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db"} err="failed to get container status \"6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db\": rpc error: code = NotFound desc = could not find container \"6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db\": container with ID starting with 6cbf4707406ce4c85360d84117a68d152a1de4f78cb8216da76ff32a42e004db not found: ID does not exist" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.608965 4749 scope.go:117] "RemoveContainer" containerID="6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3" Jan 28 19:26:21 crc kubenswrapper[4749]: E0128 19:26:21.610233 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3\": container with ID starting with 6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3 not found: ID does not exist" containerID="6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3" Jan 28 19:26:21 crc kubenswrapper[4749]: I0128 19:26:21.610259 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3"} err="failed to get container status \"6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3\": rpc error: code = NotFound desc = could not find container \"6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3\": container with ID starting with 6f74e2b57125c571bdba851c6c6800e79d9c497aab53569e08aa86db5836f1a3 not found: ID does not exist" Jan 28 19:26:22 crc kubenswrapper[4749]: I0128 19:26:22.884998 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" path="/var/lib/kubelet/pods/b7e794c5-0c7d-4459-8159-59ed3e56224d/volumes" Jan 28 19:26:29 crc kubenswrapper[4749]: I0128 19:26:29.872208 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:26:29 crc kubenswrapper[4749]: E0128 19:26:29.873062 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:26:41 crc kubenswrapper[4749]: I0128 19:26:41.872283 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:26:41 crc kubenswrapper[4749]: E0128 19:26:41.872988 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:26:56 crc kubenswrapper[4749]: I0128 19:26:56.872576 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:26:56 crc kubenswrapper[4749]: E0128 19:26:56.873527 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:27:08 crc kubenswrapper[4749]: I0128 19:27:08.871891 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:27:08 crc kubenswrapper[4749]: E0128 19:27:08.872783 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:27:22 crc kubenswrapper[4749]: I0128 19:27:22.879572 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:27:22 crc kubenswrapper[4749]: E0128 19:27:22.880285 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:27:36 crc kubenswrapper[4749]: I0128 19:27:36.872259 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:27:36 crc kubenswrapper[4749]: E0128 19:27:36.873193 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:27:50 crc kubenswrapper[4749]: I0128 19:27:50.872955 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:27:50 crc kubenswrapper[4749]: E0128 19:27:50.873989 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:28:04 crc kubenswrapper[4749]: I0128 19:28:04.871709 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:28:04 crc kubenswrapper[4749]: E0128 19:28:04.872595 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:28:17 crc kubenswrapper[4749]: I0128 19:28:17.871998 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:28:17 crc kubenswrapper[4749]: E0128 19:28:17.872823 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:28:30 crc kubenswrapper[4749]: I0128 19:28:30.872550 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:28:30 crc kubenswrapper[4749]: E0128 19:28:30.875452 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:28:44 crc kubenswrapper[4749]: I0128 19:28:44.872471 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:28:44 crc kubenswrapper[4749]: E0128 19:28:44.873428 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:28:57 crc kubenswrapper[4749]: I0128 19:28:57.871945 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:28:57 crc kubenswrapper[4749]: E0128 19:28:57.872772 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:29:09 crc kubenswrapper[4749]: I0128 19:29:09.870986 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:29:09 crc kubenswrapper[4749]: E0128 19:29:09.871900 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:29:23 crc kubenswrapper[4749]: I0128 19:29:23.871818 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:29:23 crc kubenswrapper[4749]: E0128 19:29:23.872763 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" 
podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:29:36 crc kubenswrapper[4749]: I0128 19:29:36.872293 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:29:37 crc kubenswrapper[4749]: I0128 19:29:37.514882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"227b74715310cafdc77eb97ed7edba8f7d8d839bfbaaca135d58afe5151754d5"} Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.155073 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz"] Jan 28 19:30:00 crc kubenswrapper[4749]: E0128 19:30:00.155982 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerName="registry-server" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.155994 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerName="registry-server" Jan 28 19:30:00 crc kubenswrapper[4749]: E0128 19:30:00.156012 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerName="extract-utilities" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.156019 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerName="extract-utilities" Jan 28 19:30:00 crc kubenswrapper[4749]: E0128 19:30:00.156080 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerName="extract-content" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.156087 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerName="extract-content" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.156302 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e794c5-0c7d-4459-8159-59ed3e56224d" containerName="registry-server" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.157151 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.160574 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.161367 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.177370 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz"] Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.306579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e777c8-338c-41d9-aaaa-36f9962fc71c-config-volume\") pod \"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.306776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e777c8-338c-41d9-aaaa-36f9962fc71c-secret-volume\") pod \"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.306855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gc98\" (UniqueName: \"kubernetes.io/projected/11e777c8-338c-41d9-aaaa-36f9962fc71c-kube-api-access-8gc98\") pod \"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.409595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e777c8-338c-41d9-aaaa-36f9962fc71c-config-volume\") pod \"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.409696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e777c8-338c-41d9-aaaa-36f9962fc71c-secret-volume\") pod \"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.409720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gc98\" (UniqueName: \"kubernetes.io/projected/11e777c8-338c-41d9-aaaa-36f9962fc71c-kube-api-access-8gc98\") pod \"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.410774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e777c8-338c-41d9-aaaa-36f9962fc71c-config-volume\") pod 
\"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.415142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e777c8-338c-41d9-aaaa-36f9962fc71c-secret-volume\") pod \"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.425804 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gc98\" (UniqueName: \"kubernetes.io/projected/11e777c8-338c-41d9-aaaa-36f9962fc71c-kube-api-access-8gc98\") pod \"collect-profiles-29493810-gw2qz\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.479269 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:00 crc kubenswrapper[4749]: I0128 19:30:00.956735 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz"] Jan 28 19:30:00 crc kubenswrapper[4749]: W0128 19:30:00.960501 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e777c8_338c_41d9_aaaa_36f9962fc71c.slice/crio-db203a889b2927a0a4211817863a8f7671b316f661b16deadd4d7842fdbcea79 WatchSource:0}: Error finding container db203a889b2927a0a4211817863a8f7671b316f661b16deadd4d7842fdbcea79: Status 404 returned error can't find the container with id db203a889b2927a0a4211817863a8f7671b316f661b16deadd4d7842fdbcea79 Jan 28 19:30:01 crc kubenswrapper[4749]: I0128 19:30:01.759540 4749 generic.go:334] "Generic (PLEG): container finished" podID="11e777c8-338c-41d9-aaaa-36f9962fc71c" containerID="63b2ece3c3b4e5fa0c4dc54b8d1b5e1c6918d8ce80bc6f9360527882b5723634" exitCode=0 Jan 28 19:30:01 crc kubenswrapper[4749]: I0128 19:30:01.759694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" event={"ID":"11e777c8-338c-41d9-aaaa-36f9962fc71c","Type":"ContainerDied","Data":"63b2ece3c3b4e5fa0c4dc54b8d1b5e1c6918d8ce80bc6f9360527882b5723634"} Jan 28 19:30:01 crc kubenswrapper[4749]: I0128 19:30:01.760773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" event={"ID":"11e777c8-338c-41d9-aaaa-36f9962fc71c","Type":"ContainerStarted","Data":"db203a889b2927a0a4211817863a8f7671b316f661b16deadd4d7842fdbcea79"} Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.215776 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.387297 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gc98\" (UniqueName: \"kubernetes.io/projected/11e777c8-338c-41d9-aaaa-36f9962fc71c-kube-api-access-8gc98\") pod \"11e777c8-338c-41d9-aaaa-36f9962fc71c\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.387607 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e777c8-338c-41d9-aaaa-36f9962fc71c-secret-volume\") pod \"11e777c8-338c-41d9-aaaa-36f9962fc71c\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.387697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e777c8-338c-41d9-aaaa-36f9962fc71c-config-volume\") pod \"11e777c8-338c-41d9-aaaa-36f9962fc71c\" (UID: \"11e777c8-338c-41d9-aaaa-36f9962fc71c\") " Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.388792 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e777c8-338c-41d9-aaaa-36f9962fc71c-config-volume" (OuterVolumeSpecName: "config-volume") pod "11e777c8-338c-41d9-aaaa-36f9962fc71c" (UID: "11e777c8-338c-41d9-aaaa-36f9962fc71c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.393163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e777c8-338c-41d9-aaaa-36f9962fc71c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11e777c8-338c-41d9-aaaa-36f9962fc71c" (UID: "11e777c8-338c-41d9-aaaa-36f9962fc71c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.393875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e777c8-338c-41d9-aaaa-36f9962fc71c-kube-api-access-8gc98" (OuterVolumeSpecName: "kube-api-access-8gc98") pod "11e777c8-338c-41d9-aaaa-36f9962fc71c" (UID: "11e777c8-338c-41d9-aaaa-36f9962fc71c"). InnerVolumeSpecName "kube-api-access-8gc98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.490826 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gc98\" (UniqueName: \"kubernetes.io/projected/11e777c8-338c-41d9-aaaa-36f9962fc71c-kube-api-access-8gc98\") on node \"crc\" DevicePath \"\"" Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.490860 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e777c8-338c-41d9-aaaa-36f9962fc71c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.490871 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e777c8-338c-41d9-aaaa-36f9962fc71c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.790670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" event={"ID":"11e777c8-338c-41d9-aaaa-36f9962fc71c","Type":"ContainerDied","Data":"db203a889b2927a0a4211817863a8f7671b316f661b16deadd4d7842fdbcea79"} Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.790709 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db203a889b2927a0a4211817863a8f7671b316f661b16deadd4d7842fdbcea79" Jan 28 19:30:03 crc kubenswrapper[4749]: I0128 19:30:03.791086 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493810-gw2qz" Jan 28 19:30:04 crc kubenswrapper[4749]: I0128 19:30:04.297552 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg"] Jan 28 19:30:04 crc kubenswrapper[4749]: I0128 19:30:04.308058 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493765-swqbg"] Jan 28 19:30:04 crc kubenswrapper[4749]: I0128 19:30:04.965759 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8c2c33-582a-4b5f-965c-c23d2be58edf" path="/var/lib/kubelet/pods/af8c2c33-582a-4b5f-965c-c23d2be58edf/volumes" Jan 28 19:30:28 crc kubenswrapper[4749]: I0128 19:30:28.628546 4749 scope.go:117] "RemoveContainer" containerID="ca4fdaf4d36c301591672e5580061dd7ee2ccbd535b5ad03a7f69a40db58ed14" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.741822 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5tlbc"] Jan 28 19:31:21 crc kubenswrapper[4749]: E0128 19:31:21.743615 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e777c8-338c-41d9-aaaa-36f9962fc71c" containerName="collect-profiles" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.743644 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e777c8-338c-41d9-aaaa-36f9962fc71c" containerName="collect-profiles" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.744245 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e777c8-338c-41d9-aaaa-36f9962fc71c" containerName="collect-profiles" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.748160 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.784060 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tlbc"] Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.878679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-catalog-content\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.879056 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr969\" (UniqueName: \"kubernetes.io/projected/adb932e5-b982-48dd-a6d8-7e7c533743bc-kube-api-access-wr969\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.879105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-utilities\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.981886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-catalog-content\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.982399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-catalog-content\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.982494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr969\" (UniqueName: \"kubernetes.io/projected/adb932e5-b982-48dd-a6d8-7e7c533743bc-kube-api-access-wr969\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.982828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-utilities\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:21 crc kubenswrapper[4749]: I0128 19:31:21.982877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-utilities\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:22 crc kubenswrapper[4749]: I0128 19:31:22.009167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wr969\" (UniqueName: \"kubernetes.io/projected/adb932e5-b982-48dd-a6d8-7e7c533743bc-kube-api-access-wr969\") pod \"redhat-operators-5tlbc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:22 crc kubenswrapper[4749]: I0128 19:31:22.087720 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:22 crc kubenswrapper[4749]: I0128 19:31:22.469564 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tlbc"] Jan 28 19:31:22 crc kubenswrapper[4749]: I0128 19:31:22.582039 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlbc" event={"ID":"adb932e5-b982-48dd-a6d8-7e7c533743bc","Type":"ContainerStarted","Data":"926e07f52aa56726ab68592c638deb56604b59b21a1f2e8f745152cb94c313ac"} Jan 28 19:31:23 crc kubenswrapper[4749]: I0128 19:31:23.593057 4749 generic.go:334] "Generic (PLEG): container finished" podID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerID="85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b" exitCode=0 Jan 28 19:31:23 crc kubenswrapper[4749]: I0128 19:31:23.593108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlbc" event={"ID":"adb932e5-b982-48dd-a6d8-7e7c533743bc","Type":"ContainerDied","Data":"85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b"} Jan 28 19:31:23 crc kubenswrapper[4749]: I0128 19:31:23.595497 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 19:31:24 crc kubenswrapper[4749]: I0128 19:31:24.605547 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlbc" event={"ID":"adb932e5-b982-48dd-a6d8-7e7c533743bc","Type":"ContainerStarted","Data":"a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261"} Jan 28 19:31:25 crc kubenswrapper[4749]: I0128 19:31:25.616985 4749 generic.go:334] "Generic (PLEG): container finished" podID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerID="a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261" exitCode=0 Jan 28 19:31:25 crc kubenswrapper[4749]: I0128 19:31:25.617058 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlbc" event={"ID":"adb932e5-b982-48dd-a6d8-7e7c533743bc","Type":"ContainerDied","Data":"a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261"} Jan 28 19:31:28 crc kubenswrapper[4749]: I0128 19:31:28.652949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlbc" event={"ID":"adb932e5-b982-48dd-a6d8-7e7c533743bc","Type":"ContainerStarted","Data":"893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123"} Jan 28 19:31:29 crc kubenswrapper[4749]: I0128 19:31:29.690679 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5tlbc" podStartSLOduration=4.320239892 podStartE2EDuration="8.690655677s" podCreationTimestamp="2026-01-28 19:31:21 +0000 UTC" firstStartedPulling="2026-01-28 19:31:23.595266563 +0000 UTC m=+3351.606793338" lastFinishedPulling="2026-01-28 19:31:27.965682348 +0000 UTC m=+3355.977209123" observedRunningTime="2026-01-28 19:31:29.678265277 +0000 UTC m=+3357.689792082" watchObservedRunningTime="2026-01-28 19:31:29.690655677 +0000 UTC m=+3357.702182472" Jan 28 19:31:32 crc 
kubenswrapper[4749]: I0128 19:31:32.087940 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:32 crc kubenswrapper[4749]: I0128 19:31:32.088424 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:33 crc kubenswrapper[4749]: I0128 19:31:33.140693 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5tlbc" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerName="registry-server" probeResult="failure" output=< Jan 28 19:31:33 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:31:33 crc kubenswrapper[4749]: > Jan 28 19:31:42 crc kubenswrapper[4749]: I0128 19:31:42.133655 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:42 crc kubenswrapper[4749]: I0128 19:31:42.194861 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:42 crc kubenswrapper[4749]: I0128 19:31:42.374502 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tlbc"] Jan 28 19:31:43 crc kubenswrapper[4749]: I0128 19:31:43.799017 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5tlbc" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerName="registry-server" containerID="cri-o://893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123" gracePeriod=2 Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.325631 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.432307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-utilities\") pod \"adb932e5-b982-48dd-a6d8-7e7c533743bc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.432432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-catalog-content\") pod \"adb932e5-b982-48dd-a6d8-7e7c533743bc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.432483 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr969\" (UniqueName: \"kubernetes.io/projected/adb932e5-b982-48dd-a6d8-7e7c533743bc-kube-api-access-wr969\") pod \"adb932e5-b982-48dd-a6d8-7e7c533743bc\" (UID: \"adb932e5-b982-48dd-a6d8-7e7c533743bc\") " Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.433633 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-utilities" (OuterVolumeSpecName: "utilities") pod "adb932e5-b982-48dd-a6d8-7e7c533743bc" (UID: "adb932e5-b982-48dd-a6d8-7e7c533743bc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.438455 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb932e5-b982-48dd-a6d8-7e7c533743bc-kube-api-access-wr969" (OuterVolumeSpecName: "kube-api-access-wr969") pod "adb932e5-b982-48dd-a6d8-7e7c533743bc" (UID: "adb932e5-b982-48dd-a6d8-7e7c533743bc"). InnerVolumeSpecName "kube-api-access-wr969". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.541037 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.541072 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr969\" (UniqueName: \"kubernetes.io/projected/adb932e5-b982-48dd-a6d8-7e7c533743bc-kube-api-access-wr969\") on node \"crc\" DevicePath \"\"" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.557473 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adb932e5-b982-48dd-a6d8-7e7c533743bc" (UID: "adb932e5-b982-48dd-a6d8-7e7c533743bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.643565 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb932e5-b982-48dd-a6d8-7e7c533743bc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.810814 4749 generic.go:334] "Generic (PLEG): container finished" podID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerID="893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123" exitCode=0 Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.810896 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5tlbc" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.810962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlbc" event={"ID":"adb932e5-b982-48dd-a6d8-7e7c533743bc","Type":"ContainerDied","Data":"893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123"} Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.811318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlbc" event={"ID":"adb932e5-b982-48dd-a6d8-7e7c533743bc","Type":"ContainerDied","Data":"926e07f52aa56726ab68592c638deb56604b59b21a1f2e8f745152cb94c313ac"} Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.811356 4749 scope.go:117] "RemoveContainer" containerID="893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.848567 4749 scope.go:117] "RemoveContainer" containerID="a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.857289 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tlbc"] Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.869756 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5tlbc"] Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.891523 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" path="/var/lib/kubelet/pods/adb932e5-b982-48dd-a6d8-7e7c533743bc/volumes" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.895766 4749 scope.go:117] "RemoveContainer" containerID="85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.946208 4749 scope.go:117] "RemoveContainer" containerID="893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123" Jan 28 19:31:44 crc kubenswrapper[4749]: E0128 19:31:44.946821 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123\": container with ID starting with 893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123 not found: ID does not exist" containerID="893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.946864 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123"} err="failed to get container status \"893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123\": rpc error: code = NotFound desc = could not find container \"893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123\": container with ID starting with 893d406c5783776c40cee7779f24d4e1f200ac0ff656d9669a3de3d693c2c123 not found: ID does not exist" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.946887 4749 scope.go:117] "RemoveContainer" containerID="a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261" Jan 28 19:31:44 crc kubenswrapper[4749]: E0128 19:31:44.947108 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261\": container with ID starting with 
a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261 not found: ID does not exist" containerID="a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.947131 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261"} err="failed to get container status \"a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261\": rpc error: code = NotFound desc = could not find container \"a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261\": container with ID starting with a087dd8f60e38b184bdd49a18ef697e5f714b28ced066b1482c10e6ea19e7261 not found: ID does not exist" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.947147 4749 scope.go:117] "RemoveContainer" containerID="85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b" Jan 28 19:31:44 crc kubenswrapper[4749]: E0128 19:31:44.947436 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b\": container with ID starting with 85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b not found: ID does not exist" containerID="85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b" Jan 28 19:31:44 crc kubenswrapper[4749]: I0128 19:31:44.947485 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b"} err="failed to get container status \"85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b\": rpc error: code = NotFound desc = could not find container \"85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b\": container with ID starting with 85b663758fea86f1c0319bbdde90e5a166fe82fffcc5b6e7576556a574b6151b not found: ID does not exist" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.023185 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bmqzn"] Jan 28 19:31:55 crc kubenswrapper[4749]: E0128 19:31:55.024235 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerName="extract-utilities" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.024250 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerName="extract-utilities" Jan 28 19:31:55 crc kubenswrapper[4749]: E0128 19:31:55.024260 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerName="registry-server" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.024266 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerName="registry-server" Jan 28 19:31:55 crc kubenswrapper[4749]: E0128 19:31:55.024281 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerName="extract-content" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.024288 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" containerName="extract-content" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.024558 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb932e5-b982-48dd-a6d8-7e7c533743bc" 
containerName="registry-server" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.026320 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.045575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmqzn"] Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.078050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-catalog-content\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.078137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-utilities\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.078464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jphm7\" (UniqueName: \"kubernetes.io/projected/f5eb04d5-8198-4a27-9404-1116e36156b8-kube-api-access-jphm7\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.180891 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jphm7\" (UniqueName: \"kubernetes.io/projected/f5eb04d5-8198-4a27-9404-1116e36156b8-kube-api-access-jphm7\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.181046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-catalog-content\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.181105 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-utilities\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.181600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-catalog-content\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.181669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-utilities\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " 
pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.202374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jphm7\" (UniqueName: \"kubernetes.io/projected/f5eb04d5-8198-4a27-9404-1116e36156b8-kube-api-access-jphm7\") pod \"certified-operators-bmqzn\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.368449 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.900689 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmqzn"] Jan 28 19:31:55 crc kubenswrapper[4749]: I0128 19:31:55.921518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmqzn" event={"ID":"f5eb04d5-8198-4a27-9404-1116e36156b8","Type":"ContainerStarted","Data":"9bfff693a198fbe83230dcc7f8d86b6b59dae1d509ab8e1bfd0f439baa879f75"} Jan 28 19:31:56 crc kubenswrapper[4749]: I0128 19:31:56.932984 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerID="023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925" exitCode=0 Jan 28 19:31:56 crc kubenswrapper[4749]: I0128 19:31:56.933107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmqzn" event={"ID":"f5eb04d5-8198-4a27-9404-1116e36156b8","Type":"ContainerDied","Data":"023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925"} Jan 28 19:31:57 crc kubenswrapper[4749]: I0128 19:31:57.467687 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:31:57 crc kubenswrapper[4749]: I0128 19:31:57.467985 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:31:58 crc kubenswrapper[4749]: I0128 19:31:58.954300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmqzn" event={"ID":"f5eb04d5-8198-4a27-9404-1116e36156b8","Type":"ContainerStarted","Data":"c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0"} Jan 28 19:32:00 crc kubenswrapper[4749]: I0128 19:32:00.976063 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerID="c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0" exitCode=0 Jan 28 19:32:00 crc kubenswrapper[4749]: I0128 19:32:00.976163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmqzn" event={"ID":"f5eb04d5-8198-4a27-9404-1116e36156b8","Type":"ContainerDied","Data":"c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0"} Jan 28 19:32:01 crc kubenswrapper[4749]: I0128 19:32:01.990028 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmqzn" 
event={"ID":"f5eb04d5-8198-4a27-9404-1116e36156b8","Type":"ContainerStarted","Data":"29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88"} Jan 28 19:32:02 crc kubenswrapper[4749]: I0128 19:32:02.014303 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bmqzn" podStartSLOduration=3.588908917 podStartE2EDuration="8.014281937s" podCreationTimestamp="2026-01-28 19:31:54 +0000 UTC" firstStartedPulling="2026-01-28 19:31:56.945606322 +0000 UTC m=+3384.957133107" lastFinishedPulling="2026-01-28 19:32:01.370979352 +0000 UTC m=+3389.382506127" observedRunningTime="2026-01-28 19:32:02.010658604 +0000 UTC m=+3390.022185379" watchObservedRunningTime="2026-01-28 19:32:02.014281937 +0000 UTC m=+3390.025808712" Jan 28 19:32:05 crc kubenswrapper[4749]: I0128 19:32:05.369298 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:32:05 crc kubenswrapper[4749]: I0128 19:32:05.371001 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:32:05 crc kubenswrapper[4749]: I0128 19:32:05.425878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:32:06 crc kubenswrapper[4749]: I0128 19:32:06.085762 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:32:06 crc kubenswrapper[4749]: I0128 19:32:06.139549 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmqzn"] Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.050518 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bmqzn" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerName="registry-server" containerID="cri-o://29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88" gracePeriod=2 Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.575098 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.624582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-utilities\") pod \"f5eb04d5-8198-4a27-9404-1116e36156b8\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.624870 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jphm7\" (UniqueName: \"kubernetes.io/projected/f5eb04d5-8198-4a27-9404-1116e36156b8-kube-api-access-jphm7\") pod \"f5eb04d5-8198-4a27-9404-1116e36156b8\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.624967 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-catalog-content\") pod \"f5eb04d5-8198-4a27-9404-1116e36156b8\" (UID: \"f5eb04d5-8198-4a27-9404-1116e36156b8\") " Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.625599 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-utilities" (OuterVolumeSpecName: "utilities") pod "f5eb04d5-8198-4a27-9404-1116e36156b8" (UID: "f5eb04d5-8198-4a27-9404-1116e36156b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.625930 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.631673 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5eb04d5-8198-4a27-9404-1116e36156b8-kube-api-access-jphm7" (OuterVolumeSpecName: "kube-api-access-jphm7") pod "f5eb04d5-8198-4a27-9404-1116e36156b8" (UID: "f5eb04d5-8198-4a27-9404-1116e36156b8"). InnerVolumeSpecName "kube-api-access-jphm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.679149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5eb04d5-8198-4a27-9404-1116e36156b8" (UID: "f5eb04d5-8198-4a27-9404-1116e36156b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.728070 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jphm7\" (UniqueName: \"kubernetes.io/projected/f5eb04d5-8198-4a27-9404-1116e36156b8-kube-api-access-jphm7\") on node \"crc\" DevicePath \"\"" Jan 28 19:32:08 crc kubenswrapper[4749]: I0128 19:32:08.728117 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5eb04d5-8198-4a27-9404-1116e36156b8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.063040 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerID="29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88" exitCode=0 Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.063100 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmqzn" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.063126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmqzn" event={"ID":"f5eb04d5-8198-4a27-9404-1116e36156b8","Type":"ContainerDied","Data":"29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88"} Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.063508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmqzn" event={"ID":"f5eb04d5-8198-4a27-9404-1116e36156b8","Type":"ContainerDied","Data":"9bfff693a198fbe83230dcc7f8d86b6b59dae1d509ab8e1bfd0f439baa879f75"} Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.063529 4749 scope.go:117] "RemoveContainer" containerID="29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.089455 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmqzn"] Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.097095 4749 scope.go:117] "RemoveContainer" containerID="c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.100239 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bmqzn"] Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.116457 4749 scope.go:117] "RemoveContainer" containerID="023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.168525 4749 scope.go:117] "RemoveContainer" containerID="29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88" Jan 28 19:32:09 crc kubenswrapper[4749]: E0128 19:32:09.169054 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88\": container with ID starting with 29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88 not found: ID does not exist" containerID="29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.169097 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88"} err="failed to get container status 
\"29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88\": rpc error: code = NotFound desc = could not find container \"29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88\": container with ID starting with 29bc11bc1059cc8ec4cde8c4de408e6d4068de25a80efdcf212954374f4b1a88 not found: ID does not exist" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.169125 4749 scope.go:117] "RemoveContainer" containerID="c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0" Jan 28 19:32:09 crc kubenswrapper[4749]: E0128 19:32:09.169607 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0\": container with ID starting with c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0 not found: ID does not exist" containerID="c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.169638 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0"} err="failed to get container status \"c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0\": rpc error: code = NotFound desc = could not find container \"c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0\": container with ID starting with c0f8f2158b19f4b8c7f1d70804a5f65ad14d108b43ba98f62620f7eb044c73e0 not found: ID does not exist" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.169658 4749 scope.go:117] "RemoveContainer" containerID="023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925" Jan 28 19:32:09 crc kubenswrapper[4749]: E0128 19:32:09.169950 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925\": container with ID starting with 023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925 not found: ID does not exist" containerID="023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925" Jan 28 19:32:09 crc kubenswrapper[4749]: I0128 19:32:09.169976 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925"} err="failed to get container status \"023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925\": rpc error: code = NotFound desc = could not find container \"023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925\": container with ID starting with 023e1241f31160d1e9370c1c1b77af5785cd5fa2b2f7fb12763c6260be501925 not found: ID does not exist" Jan 28 19:32:10 crc kubenswrapper[4749]: I0128 19:32:10.893974 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" path="/var/lib/kubelet/pods/f5eb04d5-8198-4a27-9404-1116e36156b8/volumes" Jan 28 19:32:27 crc kubenswrapper[4749]: I0128 19:32:27.467324 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:32:27 crc kubenswrapper[4749]: I0128 19:32:27.467970 4749 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.809966 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xgg6h"] Jan 28 19:32:28 crc kubenswrapper[4749]: E0128 19:32:28.810672 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerName="extract-content" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.810698 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerName="extract-content" Jan 28 19:32:28 crc kubenswrapper[4749]: E0128 19:32:28.810727 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerName="registry-server" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.810732 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerName="registry-server" Jan 28 19:32:28 crc kubenswrapper[4749]: E0128 19:32:28.810753 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerName="extract-utilities" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.810761 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerName="extract-utilities" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.810970 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5eb04d5-8198-4a27-9404-1116e36156b8" containerName="registry-server" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.812637 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.834990 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgg6h"] Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.918351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7cgc\" (UniqueName: \"kubernetes.io/projected/8568a6d0-6545-4579-b6bb-6b455a3abc93-kube-api-access-c7cgc\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.918682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-catalog-content\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:28 crc kubenswrapper[4749]: I0128 19:32:28.918738 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-utilities\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:29 crc kubenswrapper[4749]: I0128 19:32:29.021380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7cgc\" (UniqueName: \"kubernetes.io/projected/8568a6d0-6545-4579-b6bb-6b455a3abc93-kube-api-access-c7cgc\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:29 crc kubenswrapper[4749]: I0128 19:32:29.021460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-catalog-content\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:29 crc kubenswrapper[4749]: I0128 19:32:29.021529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-utilities\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:29 crc kubenswrapper[4749]: I0128 19:32:29.022985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-catalog-content\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:29 crc kubenswrapper[4749]: I0128 19:32:29.023108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-utilities\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:29 crc kubenswrapper[4749]: I0128 19:32:29.048234 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c7cgc\" (UniqueName: \"kubernetes.io/projected/8568a6d0-6545-4579-b6bb-6b455a3abc93-kube-api-access-c7cgc\") pod \"redhat-marketplace-xgg6h\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:29 crc kubenswrapper[4749]: I0128 19:32:29.197915 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:29 crc kubenswrapper[4749]: I0128 19:32:29.704645 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgg6h"] Jan 28 19:32:30 crc kubenswrapper[4749]: I0128 19:32:30.297076 4749 generic.go:334] "Generic (PLEG): container finished" podID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerID="48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7" exitCode=0 Jan 28 19:32:30 crc kubenswrapper[4749]: I0128 19:32:30.297132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgg6h" event={"ID":"8568a6d0-6545-4579-b6bb-6b455a3abc93","Type":"ContainerDied","Data":"48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7"} Jan 28 19:32:30 crc kubenswrapper[4749]: I0128 19:32:30.297404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgg6h" event={"ID":"8568a6d0-6545-4579-b6bb-6b455a3abc93","Type":"ContainerStarted","Data":"aa66bbca9b566674bd49bf5d72216ae0db11c999b9285ecd007b89b8de05a5f5"} Jan 28 19:32:31 crc kubenswrapper[4749]: I0128 19:32:31.328908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgg6h" event={"ID":"8568a6d0-6545-4579-b6bb-6b455a3abc93","Type":"ContainerStarted","Data":"ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e"} Jan 28 19:32:32 crc kubenswrapper[4749]: I0128 19:32:32.346039 4749 generic.go:334] "Generic (PLEG): container finished" podID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerID="ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e" exitCode=0 Jan 28 19:32:32 crc kubenswrapper[4749]: I0128 19:32:32.346129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgg6h" event={"ID":"8568a6d0-6545-4579-b6bb-6b455a3abc93","Type":"ContainerDied","Data":"ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e"} Jan 28 19:32:33 crc kubenswrapper[4749]: I0128 19:32:33.357393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgg6h" event={"ID":"8568a6d0-6545-4579-b6bb-6b455a3abc93","Type":"ContainerStarted","Data":"94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a"} Jan 28 19:32:33 crc kubenswrapper[4749]: I0128 19:32:33.378966 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xgg6h" podStartSLOduration=2.8993896550000002 podStartE2EDuration="5.378945938s" podCreationTimestamp="2026-01-28 19:32:28 +0000 UTC" firstStartedPulling="2026-01-28 19:32:30.298921988 +0000 UTC m=+3418.310448773" lastFinishedPulling="2026-01-28 19:32:32.778478281 +0000 UTC m=+3420.790005056" observedRunningTime="2026-01-28 19:32:33.373154317 +0000 UTC m=+3421.384681112" watchObservedRunningTime="2026-01-28 19:32:33.378945938 +0000 UTC m=+3421.390472713" Jan 28 19:32:39 crc kubenswrapper[4749]: I0128 19:32:39.198892 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:39 crc kubenswrapper[4749]: I0128 19:32:39.199463 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:39 crc kubenswrapper[4749]: I0128 19:32:39.269585 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:39 crc kubenswrapper[4749]: I0128 19:32:39.459980 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:39 crc kubenswrapper[4749]: I0128 19:32:39.517975 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgg6h"] Jan 28 19:32:41 crc kubenswrapper[4749]: I0128 19:32:41.439506 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xgg6h" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerName="registry-server" containerID="cri-o://94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a" gracePeriod=2 Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.014541 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.138038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7cgc\" (UniqueName: \"kubernetes.io/projected/8568a6d0-6545-4579-b6bb-6b455a3abc93-kube-api-access-c7cgc\") pod \"8568a6d0-6545-4579-b6bb-6b455a3abc93\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.138222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-utilities\") pod \"8568a6d0-6545-4579-b6bb-6b455a3abc93\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.138493 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-catalog-content\") pod \"8568a6d0-6545-4579-b6bb-6b455a3abc93\" (UID: \"8568a6d0-6545-4579-b6bb-6b455a3abc93\") " Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.139204 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-utilities" (OuterVolumeSpecName: "utilities") pod "8568a6d0-6545-4579-b6bb-6b455a3abc93" (UID: "8568a6d0-6545-4579-b6bb-6b455a3abc93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.140094 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.144510 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8568a6d0-6545-4579-b6bb-6b455a3abc93-kube-api-access-c7cgc" (OuterVolumeSpecName: "kube-api-access-c7cgc") pod "8568a6d0-6545-4579-b6bb-6b455a3abc93" (UID: "8568a6d0-6545-4579-b6bb-6b455a3abc93"). InnerVolumeSpecName "kube-api-access-c7cgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.161541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8568a6d0-6545-4579-b6bb-6b455a3abc93" (UID: "8568a6d0-6545-4579-b6bb-6b455a3abc93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.241922 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568a6d0-6545-4579-b6bb-6b455a3abc93-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.241955 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7cgc\" (UniqueName: \"kubernetes.io/projected/8568a6d0-6545-4579-b6bb-6b455a3abc93-kube-api-access-c7cgc\") on node \"crc\" DevicePath \"\"" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.450979 4749 generic.go:334] "Generic (PLEG): container finished" podID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerID="94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a" exitCode=0 Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.451031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgg6h" event={"ID":"8568a6d0-6545-4579-b6bb-6b455a3abc93","Type":"ContainerDied","Data":"94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a"} Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.451060 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgg6h" event={"ID":"8568a6d0-6545-4579-b6bb-6b455a3abc93","Type":"ContainerDied","Data":"aa66bbca9b566674bd49bf5d72216ae0db11c999b9285ecd007b89b8de05a5f5"} Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.451069 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgg6h" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.451081 4749 scope.go:117] "RemoveContainer" containerID="94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.474216 4749 scope.go:117] "RemoveContainer" containerID="ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.494751 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgg6h"] Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.503945 4749 scope.go:117] "RemoveContainer" containerID="48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.515741 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgg6h"] Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.555909 4749 scope.go:117] "RemoveContainer" containerID="94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a" Jan 28 19:32:42 crc kubenswrapper[4749]: E0128 19:32:42.556361 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a\": container with ID starting with 94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a not found: ID does not exist" containerID="94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.556395 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a"} err="failed to get container status \"94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a\": rpc error: code = NotFound desc = could not find container \"94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a\": container with ID starting with 94767b0fe6167c425b39f7445d0eaea1a2ad66c88d90a4c26e7618bccd7a046a not found: ID does not exist" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.556417 4749 scope.go:117] "RemoveContainer" containerID="ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e" Jan 28 19:32:42 crc kubenswrapper[4749]: E0128 19:32:42.557028 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e\": container with ID starting with ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e not found: ID does not exist" containerID="ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.557106 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e"} err="failed to get container status \"ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e\": rpc error: code = NotFound desc = could not find container \"ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e\": container with ID starting with ed931d1f22fbc0f84850763f237784a1b322bfbcac6a613aa0616e40c10c688e not found: ID does not exist" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.557134 4749 scope.go:117] "RemoveContainer" 
containerID="48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7" Jan 28 19:32:42 crc kubenswrapper[4749]: E0128 19:32:42.557416 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7\": container with ID starting with 48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7 not found: ID does not exist" containerID="48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.557445 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7"} err="failed to get container status \"48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7\": rpc error: code = NotFound desc = could not find container \"48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7\": container with ID starting with 48bb622d9df9253846d823ba4afd49f0409a736b8d35d013afd9d68592911da7 not found: ID does not exist" Jan 28 19:32:42 crc kubenswrapper[4749]: I0128 19:32:42.916803 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" path="/var/lib/kubelet/pods/8568a6d0-6545-4579-b6bb-6b455a3abc93/volumes" Jan 28 19:32:57 crc kubenswrapper[4749]: I0128 19:32:57.467924 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:32:57 crc kubenswrapper[4749]: I0128 19:32:57.468508 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:32:57 crc kubenswrapper[4749]: I0128 19:32:57.468557 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:32:57 crc kubenswrapper[4749]: I0128 19:32:57.469177 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"227b74715310cafdc77eb97ed7edba8f7d8d839bfbaaca135d58afe5151754d5"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:32:57 crc kubenswrapper[4749]: I0128 19:32:57.469245 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://227b74715310cafdc77eb97ed7edba8f7d8d839bfbaaca135d58afe5151754d5" gracePeriod=600 Jan 28 19:32:58 crc kubenswrapper[4749]: I0128 19:32:58.600464 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="227b74715310cafdc77eb97ed7edba8f7d8d839bfbaaca135d58afe5151754d5" exitCode=0 Jan 28 19:32:58 crc kubenswrapper[4749]: I0128 19:32:58.600557 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"227b74715310cafdc77eb97ed7edba8f7d8d839bfbaaca135d58afe5151754d5"} Jan 28 19:32:58 crc kubenswrapper[4749]: I0128 19:32:58.601043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e"} Jan 28 19:32:58 crc kubenswrapper[4749]: I0128 19:32:58.601070 4749 scope.go:117] "RemoveContainer" containerID="12fcb2618b87aa1b7f70de8525a8cc3ff8ac49fccbd3ea14d36d551d410c784e" Jan 28 19:35:27 crc kubenswrapper[4749]: I0128 19:35:27.467012 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:35:27 crc kubenswrapper[4749]: I0128 19:35:27.467672 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:35:57 crc kubenswrapper[4749]: I0128 19:35:57.467530 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:35:57 crc kubenswrapper[4749]: I0128 19:35:57.468141 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.467490 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.468015 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.468062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.468965 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.469022 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" gracePeriod=600 Jan 28 19:36:27 crc kubenswrapper[4749]: E0128 19:36:27.589063 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.657253 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" exitCode=0 Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.657774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e"} Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.657811 4749 scope.go:117] "RemoveContainer" containerID="227b74715310cafdc77eb97ed7edba8f7d8d839bfbaaca135d58afe5151754d5" Jan 28 19:36:27 crc kubenswrapper[4749]: I0128 19:36:27.658773 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:36:27 crc kubenswrapper[4749]: E0128 19:36:27.659115 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:36:41 crc kubenswrapper[4749]: I0128 19:36:41.872090 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:36:41 crc kubenswrapper[4749]: E0128 19:36:41.872916 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:36:53 crc kubenswrapper[4749]: I0128 19:36:53.872464 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:36:53 crc kubenswrapper[4749]: E0128 19:36:53.873592 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:37:04 crc kubenswrapper[4749]: I0128 19:37:04.872392 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:37:04 crc kubenswrapper[4749]: E0128 19:37:04.873177 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:37:16 crc kubenswrapper[4749]: I0128 19:37:16.872222 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:37:16 crc kubenswrapper[4749]: E0128 19:37:16.874666 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:37:31 crc kubenswrapper[4749]: I0128 19:37:31.872075 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:37:31 crc kubenswrapper[4749]: E0128 19:37:31.873594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:37:46 crc kubenswrapper[4749]: I0128 19:37:46.872567 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:37:46 crc kubenswrapper[4749]: E0128 19:37:46.873379 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:37:57 crc kubenswrapper[4749]: I0128 19:37:57.872168 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:37:57 crc kubenswrapper[4749]: E0128 19:37:57.872917 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" 
podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:38:09 crc kubenswrapper[4749]: I0128 19:38:09.871785 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:38:09 crc kubenswrapper[4749]: E0128 19:38:09.872700 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:38:24 crc kubenswrapper[4749]: I0128 19:38:24.872103 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:38:24 crc kubenswrapper[4749]: E0128 19:38:24.873183 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:38:36 crc kubenswrapper[4749]: I0128 19:38:36.871829 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:38:36 crc kubenswrapper[4749]: E0128 19:38:36.872759 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:38:50 crc kubenswrapper[4749]: I0128 19:38:50.872424 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:38:50 crc kubenswrapper[4749]: E0128 19:38:50.873117 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:38:55 crc kubenswrapper[4749]: I0128 19:38:55.303952 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gzsjh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": context deadline exceeded" start-of-body= Jan 28 19:38:55 crc kubenswrapper[4749]: I0128 19:38:55.304709 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" podUID="8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": context deadline exceeded" Jan 28 19:38:55 crc kubenswrapper[4749]: I0128 19:38:55.514857 4749 patch_prober.go:28] interesting 
pod/packageserver-d55dfcdfc-gzsjh container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": context deadline exceeded" start-of-body= Jan 28 19:38:55 crc kubenswrapper[4749]: I0128 19:38:55.514921 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gzsjh" podUID="8c405b8c-eafd-4f26-9c72-0c73e5ec4e4c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": context deadline exceeded" Jan 28 19:39:04 crc kubenswrapper[4749]: I0128 19:39:04.871827 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:39:04 crc kubenswrapper[4749]: E0128 19:39:04.872614 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:39:18 crc kubenswrapper[4749]: I0128 19:39:18.871692 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:39:18 crc kubenswrapper[4749]: E0128 19:39:18.872645 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:39:30 crc kubenswrapper[4749]: I0128 19:39:30.871903 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:39:30 crc kubenswrapper[4749]: E0128 19:39:30.872702 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:39:43 crc kubenswrapper[4749]: I0128 19:39:43.872576 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:39:43 crc kubenswrapper[4749]: E0128 19:39:43.873772 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:39:57 crc kubenswrapper[4749]: I0128 19:39:57.871687 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:39:57 crc kubenswrapper[4749]: E0128 19:39:57.872370 4749 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:40:10 crc kubenswrapper[4749]: I0128 19:40:10.871947 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:40:10 crc kubenswrapper[4749]: E0128 19:40:10.872945 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:40:24 crc kubenswrapper[4749]: I0128 19:40:24.872047 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:40:24 crc kubenswrapper[4749]: E0128 19:40:24.872847 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:40:39 crc kubenswrapper[4749]: I0128 19:40:39.871850 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:40:39 crc kubenswrapper[4749]: E0128 19:40:39.874668 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:40:50 crc kubenswrapper[4749]: I0128 19:40:50.872265 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:40:50 crc kubenswrapper[4749]: E0128 19:40:50.873571 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:41:05 crc kubenswrapper[4749]: I0128 19:41:05.873472 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:41:05 crc kubenswrapper[4749]: E0128 19:41:05.874199 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:41:18 crc kubenswrapper[4749]: I0128 19:41:18.872236 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:41:18 crc kubenswrapper[4749]: E0128 19:41:18.873155 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:41:33 crc kubenswrapper[4749]: I0128 19:41:33.871509 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:41:34 crc kubenswrapper[4749]: I0128 19:41:34.732201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"530d9e678bb30d21ac739a3cd8ac93cda8903db94744f64cdabeb20d1079d8fe"} Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.483092 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hctkl"] Jan 28 19:41:48 crc kubenswrapper[4749]: E0128 19:41:48.485416 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerName="registry-server" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.485546 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerName="registry-server" Jan 28 19:41:48 crc kubenswrapper[4749]: E0128 19:41:48.485644 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerName="extract-utilities" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.485769 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerName="extract-utilities" Jan 28 19:41:48 crc kubenswrapper[4749]: E0128 19:41:48.485907 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerName="extract-content" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.485978 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerName="extract-content" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.486304 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8568a6d0-6545-4579-b6bb-6b455a3abc93" containerName="registry-server" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.488653 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.526391 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hctkl"] Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.569667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-utilities\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.569784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxv7f\" (UniqueName: \"kubernetes.io/projected/07f974b5-b455-4b0b-a661-d6901549b001-kube-api-access-mxv7f\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.569884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-catalog-content\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.673035 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-utilities\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.673113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxv7f\" (UniqueName: \"kubernetes.io/projected/07f974b5-b455-4b0b-a661-d6901549b001-kube-api-access-mxv7f\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.673241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-catalog-content\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.673686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-utilities\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.674029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-catalog-content\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.715842 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mxv7f\" (UniqueName: \"kubernetes.io/projected/07f974b5-b455-4b0b-a661-d6901549b001-kube-api-access-mxv7f\") pod \"community-operators-hctkl\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:48 crc kubenswrapper[4749]: I0128 19:41:48.816500 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:49 crc kubenswrapper[4749]: W0128 19:41:49.479849 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07f974b5_b455_4b0b_a661_d6901549b001.slice/crio-ae4664dbeb8b08ab5e35c329f6ba70516e6ffed83916c910a9af2cf74c3ccb63 WatchSource:0}: Error finding container ae4664dbeb8b08ab5e35c329f6ba70516e6ffed83916c910a9af2cf74c3ccb63: Status 404 returned error can't find the container with id ae4664dbeb8b08ab5e35c329f6ba70516e6ffed83916c910a9af2cf74c3ccb63 Jan 28 19:41:49 crc kubenswrapper[4749]: I0128 19:41:49.480925 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hctkl"] Jan 28 19:41:49 crc kubenswrapper[4749]: I0128 19:41:49.889789 4749 generic.go:334] "Generic (PLEG): container finished" podID="07f974b5-b455-4b0b-a661-d6901549b001" containerID="9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da" exitCode=0 Jan 28 19:41:49 crc kubenswrapper[4749]: I0128 19:41:49.890121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctkl" event={"ID":"07f974b5-b455-4b0b-a661-d6901549b001","Type":"ContainerDied","Data":"9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da"} Jan 28 19:41:49 crc kubenswrapper[4749]: I0128 19:41:49.890183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctkl" event={"ID":"07f974b5-b455-4b0b-a661-d6901549b001","Type":"ContainerStarted","Data":"ae4664dbeb8b08ab5e35c329f6ba70516e6ffed83916c910a9af2cf74c3ccb63"} Jan 28 19:41:49 crc kubenswrapper[4749]: I0128 19:41:49.896975 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 19:41:49 crc kubenswrapper[4749]: I0128 19:41:49.897185 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4wqpm"] Jan 28 19:41:49 crc kubenswrapper[4749]: I0128 19:41:49.900855 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:49 crc kubenswrapper[4749]: I0128 19:41:49.931033 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4wqpm"] Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.026755 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-catalog-content\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.026907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-utilities\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.027107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k62dc\" (UniqueName: \"kubernetes.io/projected/a8faacb5-5007-4303-9372-ef5247be41af-kube-api-access-k62dc\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.129482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-catalog-content\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.129570 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-utilities\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.129650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k62dc\" (UniqueName: \"kubernetes.io/projected/a8faacb5-5007-4303-9372-ef5247be41af-kube-api-access-k62dc\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.130068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-catalog-content\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.130177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-utilities\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.161039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k62dc\" (UniqueName: \"kubernetes.io/projected/a8faacb5-5007-4303-9372-ef5247be41af-kube-api-access-k62dc\") pod \"redhat-operators-4wqpm\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.224974 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.759891 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4wqpm"] Jan 28 19:41:50 crc kubenswrapper[4749]: I0128 19:41:50.918479 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wqpm" event={"ID":"a8faacb5-5007-4303-9372-ef5247be41af","Type":"ContainerStarted","Data":"bd64e293f63c0d58483eeff569fa6235304b1b6ccfff4136376805ceefb5b1c7"} Jan 28 19:41:51 crc kubenswrapper[4749]: I0128 19:41:51.930143 4749 generic.go:334] "Generic (PLEG): container finished" podID="a8faacb5-5007-4303-9372-ef5247be41af" containerID="85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c" exitCode=0 Jan 28 19:41:51 crc kubenswrapper[4749]: I0128 19:41:51.930468 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wqpm" event={"ID":"a8faacb5-5007-4303-9372-ef5247be41af","Type":"ContainerDied","Data":"85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c"} Jan 28 19:41:51 crc kubenswrapper[4749]: I0128 19:41:51.936923 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctkl" event={"ID":"07f974b5-b455-4b0b-a661-d6901549b001","Type":"ContainerStarted","Data":"28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0"} Jan 28 19:41:53 crc kubenswrapper[4749]: I0128 19:41:53.958701 4749 generic.go:334] "Generic (PLEG): container finished" podID="07f974b5-b455-4b0b-a661-d6901549b001" containerID="28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0" exitCode=0 Jan 28 19:41:53 crc kubenswrapper[4749]: I0128 19:41:53.958746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctkl" event={"ID":"07f974b5-b455-4b0b-a661-d6901549b001","Type":"ContainerDied","Data":"28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0"} Jan 28 19:41:53 crc kubenswrapper[4749]: I0128 19:41:53.962942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wqpm" event={"ID":"a8faacb5-5007-4303-9372-ef5247be41af","Type":"ContainerStarted","Data":"5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0"} Jan 28 19:41:55 crc kubenswrapper[4749]: I0128 19:41:55.996280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctkl" event={"ID":"07f974b5-b455-4b0b-a661-d6901549b001","Type":"ContainerStarted","Data":"049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b"} Jan 28 19:41:56 crc kubenswrapper[4749]: I0128 19:41:56.016681 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hctkl" podStartSLOduration=3.533977563 podStartE2EDuration="8.016652304s" podCreationTimestamp="2026-01-28 19:41:48 +0000 UTC" firstStartedPulling="2026-01-28 19:41:49.896720464 +0000 UTC m=+3977.908247239" lastFinishedPulling="2026-01-28 19:41:54.379395205 +0000 UTC m=+3982.390921980" 
observedRunningTime="2026-01-28 19:41:56.015933316 +0000 UTC m=+3984.027460101" watchObservedRunningTime="2026-01-28 19:41:56.016652304 +0000 UTC m=+3984.028179079" Jan 28 19:41:58 crc kubenswrapper[4749]: I0128 19:41:58.816831 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:58 crc kubenswrapper[4749]: I0128 19:41:58.817315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:41:59 crc kubenswrapper[4749]: I0128 19:41:59.026440 4749 generic.go:334] "Generic (PLEG): container finished" podID="a8faacb5-5007-4303-9372-ef5247be41af" containerID="5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0" exitCode=0 Jan 28 19:41:59 crc kubenswrapper[4749]: I0128 19:41:59.026521 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wqpm" event={"ID":"a8faacb5-5007-4303-9372-ef5247be41af","Type":"ContainerDied","Data":"5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0"} Jan 28 19:41:59 crc kubenswrapper[4749]: I0128 19:41:59.869785 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hctkl" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="registry-server" probeResult="failure" output=< Jan 28 19:41:59 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:41:59 crc kubenswrapper[4749]: > Jan 28 19:42:00 crc kubenswrapper[4749]: I0128 19:42:00.040690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wqpm" event={"ID":"a8faacb5-5007-4303-9372-ef5247be41af","Type":"ContainerStarted","Data":"3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e"} Jan 28 19:42:00 crc kubenswrapper[4749]: I0128 19:42:00.081939 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4wqpm" podStartSLOduration=3.531889961 podStartE2EDuration="11.081918131s" podCreationTimestamp="2026-01-28 19:41:49 +0000 UTC" firstStartedPulling="2026-01-28 19:41:51.938463974 +0000 UTC m=+3979.949990749" lastFinishedPulling="2026-01-28 19:41:59.488492154 +0000 UTC m=+3987.500018919" observedRunningTime="2026-01-28 19:42:00.073662831 +0000 UTC m=+3988.085189616" watchObservedRunningTime="2026-01-28 19:42:00.081918131 +0000 UTC m=+3988.093444906" Jan 28 19:42:00 crc kubenswrapper[4749]: I0128 19:42:00.225589 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:42:00 crc kubenswrapper[4749]: I0128 19:42:00.226090 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:42:01 crc kubenswrapper[4749]: I0128 19:42:01.273010 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4wqpm" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="registry-server" probeResult="failure" output=< Jan 28 19:42:01 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:42:01 crc kubenswrapper[4749]: > Jan 28 19:42:08 crc kubenswrapper[4749]: I0128 19:42:08.868919 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:42:08 crc kubenswrapper[4749]: I0128 19:42:08.925188 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:42:09 crc kubenswrapper[4749]: I0128 19:42:09.122146 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hctkl"] Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.160882 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hctkl" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="registry-server" containerID="cri-o://049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b" gracePeriod=2 Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.723395 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.852062 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxv7f\" (UniqueName: \"kubernetes.io/projected/07f974b5-b455-4b0b-a661-d6901549b001-kube-api-access-mxv7f\") pod \"07f974b5-b455-4b0b-a661-d6901549b001\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.852436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-catalog-content\") pod \"07f974b5-b455-4b0b-a661-d6901549b001\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.852897 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-utilities\") pod \"07f974b5-b455-4b0b-a661-d6901549b001\" (UID: \"07f974b5-b455-4b0b-a661-d6901549b001\") " Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.853364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-utilities" (OuterVolumeSpecName: "utilities") pod "07f974b5-b455-4b0b-a661-d6901549b001" (UID: "07f974b5-b455-4b0b-a661-d6901549b001"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.854176 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.862576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f974b5-b455-4b0b-a661-d6901549b001-kube-api-access-mxv7f" (OuterVolumeSpecName: "kube-api-access-mxv7f") pod "07f974b5-b455-4b0b-a661-d6901549b001" (UID: "07f974b5-b455-4b0b-a661-d6901549b001"). InnerVolumeSpecName "kube-api-access-mxv7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.933570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07f974b5-b455-4b0b-a661-d6901549b001" (UID: "07f974b5-b455-4b0b-a661-d6901549b001"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.956048 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxv7f\" (UniqueName: \"kubernetes.io/projected/07f974b5-b455-4b0b-a661-d6901549b001-kube-api-access-mxv7f\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:10 crc kubenswrapper[4749]: I0128 19:42:10.956092 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07f974b5-b455-4b0b-a661-d6901549b001-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.171588 4749 generic.go:334] "Generic (PLEG): container finished" podID="07f974b5-b455-4b0b-a661-d6901549b001" containerID="049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b" exitCode=0 Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.171646 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctkl" event={"ID":"07f974b5-b455-4b0b-a661-d6901549b001","Type":"ContainerDied","Data":"049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b"} Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.171718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctkl" event={"ID":"07f974b5-b455-4b0b-a661-d6901549b001","Type":"ContainerDied","Data":"ae4664dbeb8b08ab5e35c329f6ba70516e6ffed83916c910a9af2cf74c3ccb63"} Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.171744 4749 scope.go:117] "RemoveContainer" containerID="049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.171669 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hctkl" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.209383 4749 scope.go:117] "RemoveContainer" containerID="28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.231033 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hctkl"] Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.243836 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hctkl"] Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.256898 4749 scope.go:117] "RemoveContainer" containerID="9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.283621 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4wqpm" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="registry-server" probeResult="failure" output=< Jan 28 19:42:11 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:42:11 crc kubenswrapper[4749]: > Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.308538 4749 scope.go:117] "RemoveContainer" containerID="049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b" Jan 28 19:42:11 crc kubenswrapper[4749]: E0128 19:42:11.309007 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b\": container with ID starting with 049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b not found: ID does not exist" containerID="049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.309044 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b"} err="failed to get container status \"049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b\": rpc error: code = NotFound desc = could not find container \"049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b\": container with ID starting with 049d926292c221dc48bfa7dc02d3f6d57b214c7d428a060ce3f0ce9283c9769b not found: ID does not exist" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.309086 4749 scope.go:117] "RemoveContainer" containerID="28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0" Jan 28 19:42:11 crc kubenswrapper[4749]: E0128 19:42:11.309486 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0\": container with ID starting with 28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0 not found: ID does not exist" containerID="28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.309508 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0"} err="failed to get container status \"28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0\": rpc error: code = NotFound desc = could not find container \"28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0\": 
container with ID starting with 28792d491cd5e4ceef8a1c31ef7d924a56c0888dcb3ddec65fb33462930459a0 not found: ID does not exist" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.309538 4749 scope.go:117] "RemoveContainer" containerID="9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da" Jan 28 19:42:11 crc kubenswrapper[4749]: E0128 19:42:11.309876 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da\": container with ID starting with 9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da not found: ID does not exist" containerID="9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da" Jan 28 19:42:11 crc kubenswrapper[4749]: I0128 19:42:11.309912 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da"} err="failed to get container status \"9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da\": rpc error: code = NotFound desc = could not find container \"9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da\": container with ID starting with 9935ce19aa5af52d53d7eac84afc4f7dd1b4005a4f8459fa71021bb972f071da not found: ID does not exist" Jan 28 19:42:12 crc kubenswrapper[4749]: I0128 19:42:12.883926 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f974b5-b455-4b0b-a661-d6901549b001" path="/var/lib/kubelet/pods/07f974b5-b455-4b0b-a661-d6901549b001/volumes" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.316979 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lvsvm"] Jan 28 19:42:16 crc kubenswrapper[4749]: E0128 19:42:16.318252 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="extract-utilities" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.318274 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="extract-utilities" Jan 28 19:42:16 crc kubenswrapper[4749]: E0128 19:42:16.318299 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="registry-server" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.318308 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="registry-server" Jan 28 19:42:16 crc kubenswrapper[4749]: E0128 19:42:16.318370 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="extract-content" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.318382 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="extract-content" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.318696 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f974b5-b455-4b0b-a661-d6901549b001" containerName="registry-server" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.321401 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.327918 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvsvm"] Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.515705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-catalog-content\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.516968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-utilities\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.517104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r72b2\" (UniqueName: \"kubernetes.io/projected/c2377982-fb92-4a47-9302-e7384e9b2de1-kube-api-access-r72b2\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.619170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-utilities\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.619247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r72b2\" (UniqueName: \"kubernetes.io/projected/c2377982-fb92-4a47-9302-e7384e9b2de1-kube-api-access-r72b2\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.619415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-catalog-content\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.619932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-utilities\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.620071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-catalog-content\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.639550 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r72b2\" (UniqueName: \"kubernetes.io/projected/c2377982-fb92-4a47-9302-e7384e9b2de1-kube-api-access-r72b2\") pod \"certified-operators-lvsvm\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:16 crc kubenswrapper[4749]: I0128 19:42:16.649522 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:17 crc kubenswrapper[4749]: I0128 19:42:17.184663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvsvm"] Jan 28 19:42:18 crc kubenswrapper[4749]: I0128 19:42:18.242484 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerID="83ca85bc9397ed4a73c978f7dcff2f6109f6b37963ff97be53ab14766545cb2a" exitCode=0 Jan 28 19:42:18 crc kubenswrapper[4749]: I0128 19:42:18.242545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvsvm" event={"ID":"c2377982-fb92-4a47-9302-e7384e9b2de1","Type":"ContainerDied","Data":"83ca85bc9397ed4a73c978f7dcff2f6109f6b37963ff97be53ab14766545cb2a"} Jan 28 19:42:18 crc kubenswrapper[4749]: I0128 19:42:18.242959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvsvm" event={"ID":"c2377982-fb92-4a47-9302-e7384e9b2de1","Type":"ContainerStarted","Data":"56962b3fb6def5f21b292c08c4a675f5988cb39d330af86cfa40c98b16e8fa89"} Jan 28 19:42:19 crc kubenswrapper[4749]: I0128 19:42:19.266878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvsvm" event={"ID":"c2377982-fb92-4a47-9302-e7384e9b2de1","Type":"ContainerStarted","Data":"6a1da2f628c8dee9cd8425858637cd5bffc39cedca1d165f0f4534bd7e972d12"} Jan 28 19:42:21 crc kubenswrapper[4749]: I0128 19:42:21.271084 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4wqpm" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="registry-server" probeResult="failure" output=< Jan 28 19:42:21 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:42:21 crc kubenswrapper[4749]: > Jan 28 19:42:22 crc kubenswrapper[4749]: I0128 19:42:22.296421 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerID="6a1da2f628c8dee9cd8425858637cd5bffc39cedca1d165f0f4534bd7e972d12" exitCode=0 Jan 28 19:42:22 crc kubenswrapper[4749]: I0128 19:42:22.296517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvsvm" event={"ID":"c2377982-fb92-4a47-9302-e7384e9b2de1","Type":"ContainerDied","Data":"6a1da2f628c8dee9cd8425858637cd5bffc39cedca1d165f0f4534bd7e972d12"} Jan 28 19:42:23 crc kubenswrapper[4749]: I0128 19:42:23.311607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvsvm" event={"ID":"c2377982-fb92-4a47-9302-e7384e9b2de1","Type":"ContainerStarted","Data":"212fa1ae576aada9594789508c810a5bb852c4d2aa79f51ca54d3295aeb7f5bc"} Jan 28 19:42:23 crc kubenswrapper[4749]: I0128 19:42:23.333909 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lvsvm" podStartSLOduration=2.83345546 podStartE2EDuration="7.333890842s" podCreationTimestamp="2026-01-28 19:42:16 +0000 UTC" firstStartedPulling="2026-01-28 
19:42:18.244595185 +0000 UTC m=+4006.256121960" lastFinishedPulling="2026-01-28 19:42:22.745030567 +0000 UTC m=+4010.756557342" observedRunningTime="2026-01-28 19:42:23.33090705 +0000 UTC m=+4011.342433835" watchObservedRunningTime="2026-01-28 19:42:23.333890842 +0000 UTC m=+4011.345417617" Jan 28 19:42:26 crc kubenswrapper[4749]: I0128 19:42:26.650231 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:26 crc kubenswrapper[4749]: I0128 19:42:26.650814 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:27 crc kubenswrapper[4749]: I0128 19:42:27.727847 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-lvsvm" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="registry-server" probeResult="failure" output=< Jan 28 19:42:27 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:42:27 crc kubenswrapper[4749]: > Jan 28 19:42:30 crc kubenswrapper[4749]: I0128 19:42:30.282054 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:42:30 crc kubenswrapper[4749]: I0128 19:42:30.333386 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:42:30 crc kubenswrapper[4749]: I0128 19:42:30.517243 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4wqpm"] Jan 28 19:42:31 crc kubenswrapper[4749]: I0128 19:42:31.389287 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4wqpm" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="registry-server" containerID="cri-o://3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e" gracePeriod=2 Jan 28 19:42:31 crc kubenswrapper[4749]: I0128 19:42:31.835954 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:42:31 crc kubenswrapper[4749]: I0128 19:42:31.987355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-utilities\") pod \"a8faacb5-5007-4303-9372-ef5247be41af\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " Jan 28 19:42:31 crc kubenswrapper[4749]: I0128 19:42:31.987491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-catalog-content\") pod \"a8faacb5-5007-4303-9372-ef5247be41af\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " Jan 28 19:42:31 crc kubenswrapper[4749]: I0128 19:42:31.987740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k62dc\" (UniqueName: \"kubernetes.io/projected/a8faacb5-5007-4303-9372-ef5247be41af-kube-api-access-k62dc\") pod \"a8faacb5-5007-4303-9372-ef5247be41af\" (UID: \"a8faacb5-5007-4303-9372-ef5247be41af\") " Jan 28 19:42:31 crc kubenswrapper[4749]: I0128 19:42:31.988565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-utilities" (OuterVolumeSpecName: "utilities") pod "a8faacb5-5007-4303-9372-ef5247be41af" (UID: "a8faacb5-5007-4303-9372-ef5247be41af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:42:31 crc kubenswrapper[4749]: I0128 19:42:31.994923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8faacb5-5007-4303-9372-ef5247be41af-kube-api-access-k62dc" (OuterVolumeSpecName: "kube-api-access-k62dc") pod "a8faacb5-5007-4303-9372-ef5247be41af" (UID: "a8faacb5-5007-4303-9372-ef5247be41af"). InnerVolumeSpecName "kube-api-access-k62dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.090824 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.090880 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k62dc\" (UniqueName: \"kubernetes.io/projected/a8faacb5-5007-4303-9372-ef5247be41af-kube-api-access-k62dc\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.127177 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8faacb5-5007-4303-9372-ef5247be41af" (UID: "a8faacb5-5007-4303-9372-ef5247be41af"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.196304 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8faacb5-5007-4303-9372-ef5247be41af-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.403958 4749 generic.go:334] "Generic (PLEG): container finished" podID="a8faacb5-5007-4303-9372-ef5247be41af" containerID="3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e" exitCode=0 Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.404012 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wqpm" event={"ID":"a8faacb5-5007-4303-9372-ef5247be41af","Type":"ContainerDied","Data":"3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e"} Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.404044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wqpm" event={"ID":"a8faacb5-5007-4303-9372-ef5247be41af","Type":"ContainerDied","Data":"bd64e293f63c0d58483eeff569fa6235304b1b6ccfff4136376805ceefb5b1c7"} Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.404067 4749 scope.go:117] "RemoveContainer" containerID="3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.404261 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wqpm" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.433481 4749 scope.go:117] "RemoveContainer" containerID="5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.444201 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4wqpm"] Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.455864 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4wqpm"] Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.473127 4749 scope.go:117] "RemoveContainer" containerID="85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.520640 4749 scope.go:117] "RemoveContainer" containerID="3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e" Jan 28 19:42:32 crc kubenswrapper[4749]: E0128 19:42:32.521079 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e\": container with ID starting with 3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e not found: ID does not exist" containerID="3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.521133 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e"} err="failed to get container status \"3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e\": rpc error: code = NotFound desc = could not find container \"3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e\": container with ID starting with 3f01f79a19f8bcd7d0ee8d5e83c00b48a074a5da36f6ef9ed46249f2170cba8e not found: ID does not exist" Jan 28 19:42:32 crc 
kubenswrapper[4749]: I0128 19:42:32.521168 4749 scope.go:117] "RemoveContainer" containerID="5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0" Jan 28 19:42:32 crc kubenswrapper[4749]: E0128 19:42:32.521622 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0\": container with ID starting with 5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0 not found: ID does not exist" containerID="5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.521674 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0"} err="failed to get container status \"5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0\": rpc error: code = NotFound desc = could not find container \"5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0\": container with ID starting with 5628f12e55905d0796b911548875bc1b2da998450da3111ed9b7f40ec1fb37e0 not found: ID does not exist" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.521702 4749 scope.go:117] "RemoveContainer" containerID="85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c" Jan 28 19:42:32 crc kubenswrapper[4749]: E0128 19:42:32.522067 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c\": container with ID starting with 85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c not found: ID does not exist" containerID="85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.522106 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c"} err="failed to get container status \"85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c\": rpc error: code = NotFound desc = could not find container \"85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c\": container with ID starting with 85d4a37125c3325501e384b4056d646cae5ba843d14d439fad8c81d8f01bcb7c not found: ID does not exist" Jan 28 19:42:32 crc kubenswrapper[4749]: I0128 19:42:32.885395 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8faacb5-5007-4303-9372-ef5247be41af" path="/var/lib/kubelet/pods/a8faacb5-5007-4303-9372-ef5247be41af/volumes" Jan 28 19:42:36 crc kubenswrapper[4749]: I0128 19:42:36.717655 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:36 crc kubenswrapper[4749]: I0128 19:42:36.778174 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:36 crc kubenswrapper[4749]: I0128 19:42:36.967591 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvsvm"] Jan 28 19:42:38 crc kubenswrapper[4749]: I0128 19:42:38.466905 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lvsvm" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="registry-server" 
containerID="cri-o://212fa1ae576aada9594789508c810a5bb852c4d2aa79f51ca54d3295aeb7f5bc" gracePeriod=2 Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.481551 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerID="212fa1ae576aada9594789508c810a5bb852c4d2aa79f51ca54d3295aeb7f5bc" exitCode=0 Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.481641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvsvm" event={"ID":"c2377982-fb92-4a47-9302-e7384e9b2de1","Type":"ContainerDied","Data":"212fa1ae576aada9594789508c810a5bb852c4d2aa79f51ca54d3295aeb7f5bc"} Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.648150 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.819362 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r72b2\" (UniqueName: \"kubernetes.io/projected/c2377982-fb92-4a47-9302-e7384e9b2de1-kube-api-access-r72b2\") pod \"c2377982-fb92-4a47-9302-e7384e9b2de1\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.819613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-utilities\") pod \"c2377982-fb92-4a47-9302-e7384e9b2de1\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.819702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-catalog-content\") pod \"c2377982-fb92-4a47-9302-e7384e9b2de1\" (UID: \"c2377982-fb92-4a47-9302-e7384e9b2de1\") " Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.820461 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-utilities" (OuterVolumeSpecName: "utilities") pod "c2377982-fb92-4a47-9302-e7384e9b2de1" (UID: "c2377982-fb92-4a47-9302-e7384e9b2de1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.823063 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.824856 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2377982-fb92-4a47-9302-e7384e9b2de1-kube-api-access-r72b2" (OuterVolumeSpecName: "kube-api-access-r72b2") pod "c2377982-fb92-4a47-9302-e7384e9b2de1" (UID: "c2377982-fb92-4a47-9302-e7384e9b2de1"). InnerVolumeSpecName "kube-api-access-r72b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.869097 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2377982-fb92-4a47-9302-e7384e9b2de1" (UID: "c2377982-fb92-4a47-9302-e7384e9b2de1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.926105 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r72b2\" (UniqueName: \"kubernetes.io/projected/c2377982-fb92-4a47-9302-e7384e9b2de1-kube-api-access-r72b2\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:39 crc kubenswrapper[4749]: I0128 19:42:39.926169 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2377982-fb92-4a47-9302-e7384e9b2de1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:42:40 crc kubenswrapper[4749]: I0128 19:42:40.493428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvsvm" event={"ID":"c2377982-fb92-4a47-9302-e7384e9b2de1","Type":"ContainerDied","Data":"56962b3fb6def5f21b292c08c4a675f5988cb39d330af86cfa40c98b16e8fa89"} Jan 28 19:42:40 crc kubenswrapper[4749]: I0128 19:42:40.493509 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvsvm" Jan 28 19:42:40 crc kubenswrapper[4749]: I0128 19:42:40.493762 4749 scope.go:117] "RemoveContainer" containerID="212fa1ae576aada9594789508c810a5bb852c4d2aa79f51ca54d3295aeb7f5bc" Jan 28 19:42:40 crc kubenswrapper[4749]: I0128 19:42:40.514499 4749 scope.go:117] "RemoveContainer" containerID="6a1da2f628c8dee9cd8425858637cd5bffc39cedca1d165f0f4534bd7e972d12" Jan 28 19:42:40 crc kubenswrapper[4749]: I0128 19:42:40.537841 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvsvm"] Jan 28 19:42:40 crc kubenswrapper[4749]: I0128 19:42:40.553437 4749 scope.go:117] "RemoveContainer" containerID="83ca85bc9397ed4a73c978f7dcff2f6109f6b37963ff97be53ab14766545cb2a" Jan 28 19:42:40 crc kubenswrapper[4749]: I0128 19:42:40.555165 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lvsvm"] Jan 28 19:42:40 crc kubenswrapper[4749]: I0128 19:42:40.884291 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" path="/var/lib/kubelet/pods/c2377982-fb92-4a47-9302-e7384e9b2de1/volumes" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.635013 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qfswt"] Jan 28 19:43:47 crc kubenswrapper[4749]: E0128 19:43:47.635902 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="registry-server" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.635917 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="registry-server" Jan 28 19:43:47 crc kubenswrapper[4749]: E0128 19:43:47.635932 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="registry-server" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.635938 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="registry-server" Jan 28 19:43:47 crc kubenswrapper[4749]: E0128 19:43:47.635954 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="extract-content" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.635959 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="extract-content" Jan 28 19:43:47 crc kubenswrapper[4749]: E0128 19:43:47.635983 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="extract-utilities" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.635989 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="extract-utilities" Jan 28 19:43:47 crc kubenswrapper[4749]: E0128 19:43:47.636004 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="extract-utilities" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.636009 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="extract-utilities" Jan 28 19:43:47 crc kubenswrapper[4749]: E0128 19:43:47.636032 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="extract-content" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.636038 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="extract-content" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.636231 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2377982-fb92-4a47-9302-e7384e9b2de1" containerName="registry-server" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.636247 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8faacb5-5007-4303-9372-ef5247be41af" containerName="registry-server" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.637826 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.651515 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfswt"] Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.737347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v56rd\" (UniqueName: \"kubernetes.io/projected/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-kube-api-access-v56rd\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.737423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-catalog-content\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.737458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-utilities\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.840262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v56rd\" (UniqueName: \"kubernetes.io/projected/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-kube-api-access-v56rd\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.840344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-catalog-content\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.840390 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-utilities\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.840888 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-catalog-content\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.841069 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-utilities\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.870302 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-v56rd\" (UniqueName: \"kubernetes.io/projected/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-kube-api-access-v56rd\") pod \"redhat-marketplace-qfswt\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:47 crc kubenswrapper[4749]: I0128 19:43:47.964641 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:48 crc kubenswrapper[4749]: I0128 19:43:48.499050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfswt"] Jan 28 19:43:49 crc kubenswrapper[4749]: I0128 19:43:49.202235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfswt" event={"ID":"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618","Type":"ContainerStarted","Data":"380a819922fd306e03ad198e86a1978338a54de43529fdc17eeb274dc34c41c9"} Jan 28 19:43:52 crc kubenswrapper[4749]: I0128 19:43:52.231583 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerID="1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2" exitCode=0 Jan 28 19:43:52 crc kubenswrapper[4749]: I0128 19:43:52.231695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfswt" event={"ID":"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618","Type":"ContainerDied","Data":"1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2"} Jan 28 19:43:53 crc kubenswrapper[4749]: I0128 19:43:53.245433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfswt" event={"ID":"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618","Type":"ContainerStarted","Data":"d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b"} Jan 28 19:43:54 crc kubenswrapper[4749]: I0128 19:43:54.258125 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerID="d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b" exitCode=0 Jan 28 19:43:54 crc kubenswrapper[4749]: I0128 19:43:54.258189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfswt" event={"ID":"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618","Type":"ContainerDied","Data":"d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b"} Jan 28 19:43:55 crc kubenswrapper[4749]: I0128 19:43:55.272550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfswt" event={"ID":"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618","Type":"ContainerStarted","Data":"295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f"} Jan 28 19:43:55 crc kubenswrapper[4749]: I0128 19:43:55.296926 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qfswt" podStartSLOduration=5.833868824 podStartE2EDuration="8.296896805s" podCreationTimestamp="2026-01-28 19:43:47 +0000 UTC" firstStartedPulling="2026-01-28 19:43:52.234506475 +0000 UTC m=+4100.246033260" lastFinishedPulling="2026-01-28 19:43:54.697534466 +0000 UTC m=+4102.709061241" observedRunningTime="2026-01-28 19:43:55.294242939 +0000 UTC m=+4103.305769714" watchObservedRunningTime="2026-01-28 19:43:55.296896805 +0000 UTC m=+4103.308423590" Jan 28 19:43:57 crc kubenswrapper[4749]: I0128 19:43:57.469267 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:43:57 crc kubenswrapper[4749]: I0128 19:43:57.471036 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:43:57 crc kubenswrapper[4749]: I0128 19:43:57.965466 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:57 crc kubenswrapper[4749]: I0128 19:43:57.965845 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:58 crc kubenswrapper[4749]: I0128 19:43:58.019027 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:59 crc kubenswrapper[4749]: I0128 19:43:59.352546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:43:59 crc kubenswrapper[4749]: I0128 19:43:59.402040 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfswt"] Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.320604 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qfswt" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerName="registry-server" containerID="cri-o://295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f" gracePeriod=2 Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.776443 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.878720 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-utilities\") pod \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.878809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v56rd\" (UniqueName: \"kubernetes.io/projected/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-kube-api-access-v56rd\") pod \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.878984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-catalog-content\") pod \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\" (UID: \"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618\") " Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.879509 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-utilities" (OuterVolumeSpecName: "utilities") pod "ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" (UID: "ff19f1ea-3dcc-479b-86e1-df5b0e8cd618"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.879609 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.885578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-kube-api-access-v56rd" (OuterVolumeSpecName: "kube-api-access-v56rd") pod "ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" (UID: "ff19f1ea-3dcc-479b-86e1-df5b0e8cd618"). InnerVolumeSpecName "kube-api-access-v56rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.910759 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" (UID: "ff19f1ea-3dcc-479b-86e1-df5b0e8cd618"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.983138 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v56rd\" (UniqueName: \"kubernetes.io/projected/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-kube-api-access-v56rd\") on node \"crc\" DevicePath \"\"" Jan 28 19:44:01 crc kubenswrapper[4749]: I0128 19:44:01.983179 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.332149 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerID="295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f" exitCode=0 Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.332202 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfswt" event={"ID":"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618","Type":"ContainerDied","Data":"295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f"} Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.332258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfswt" event={"ID":"ff19f1ea-3dcc-479b-86e1-df5b0e8cd618","Type":"ContainerDied","Data":"380a819922fd306e03ad198e86a1978338a54de43529fdc17eeb274dc34c41c9"} Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.332279 4749 scope.go:117] "RemoveContainer" containerID="295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.332217 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qfswt" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.358857 4749 scope.go:117] "RemoveContainer" containerID="d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.376720 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfswt"] Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.389810 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfswt"] Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.392684 4749 scope.go:117] "RemoveContainer" containerID="1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.445286 4749 scope.go:117] "RemoveContainer" containerID="295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f" Jan 28 19:44:02 crc kubenswrapper[4749]: E0128 19:44:02.455977 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f\": container with ID starting with 295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f not found: ID does not exist" containerID="295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.456031 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f"} err="failed to get container status \"295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f\": rpc error: code = NotFound desc = could not find container \"295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f\": container with ID starting with 295efd9a90d74489a2e7b5a88d1bcce1474af7c3ef36af909c623b9fd6a7a59f not found: ID does not exist" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.456058 4749 scope.go:117] "RemoveContainer" containerID="d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b" Jan 28 19:44:02 crc kubenswrapper[4749]: E0128 19:44:02.457179 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b\": container with ID starting with d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b not found: ID does not exist" containerID="d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.457237 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b"} err="failed to get container status \"d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b\": rpc error: code = NotFound desc = could not find container \"d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b\": container with ID starting with d823249a7853f54cf3b4f0a63713385038c8d017dab2ddc370747259a574553b not found: ID does not exist" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.457267 4749 scope.go:117] "RemoveContainer" containerID="1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2" Jan 28 19:44:02 crc kubenswrapper[4749]: E0128 19:44:02.457911 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2\": container with ID starting with 1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2 not found: ID does not exist" containerID="1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.457937 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2"} err="failed to get container status \"1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2\": rpc error: code = NotFound desc = could not find container \"1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2\": container with ID starting with 1d0217e9f18a5e999d828036f5e83d1e67180d9b98ee8dd12daacf6540ecc1d2 not found: ID does not exist" Jan 28 19:44:02 crc kubenswrapper[4749]: I0128 19:44:02.883195 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" path="/var/lib/kubelet/pods/ff19f1ea-3dcc-479b-86e1-df5b0e8cd618/volumes" Jan 28 19:44:27 crc kubenswrapper[4749]: I0128 19:44:27.467743 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:44:27 crc kubenswrapper[4749]: I0128 19:44:27.468273 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.467774 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.468640 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.468711 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.470091 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"530d9e678bb30d21ac739a3cd8ac93cda8903db94744f64cdabeb20d1079d8fe"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.470161 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" 
podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://530d9e678bb30d21ac739a3cd8ac93cda8903db94744f64cdabeb20d1079d8fe" gracePeriod=600 Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.869027 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="530d9e678bb30d21ac739a3cd8ac93cda8903db94744f64cdabeb20d1079d8fe" exitCode=0 Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.869102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"530d9e678bb30d21ac739a3cd8ac93cda8903db94744f64cdabeb20d1079d8fe"} Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.869416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d"} Jan 28 19:44:57 crc kubenswrapper[4749]: I0128 19:44:57.869441 4749 scope.go:117] "RemoveContainer" containerID="0aa7f225c9892e536932f9a59fe0edbeac93530aeca72ce380a60aceeb148f7e" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.182009 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7"] Jan 28 19:45:00 crc kubenswrapper[4749]: E0128 19:45:00.183099 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerName="extract-content" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.183112 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerName="extract-content" Jan 28 19:45:00 crc kubenswrapper[4749]: E0128 19:45:00.183131 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerName="extract-utilities" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.183137 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerName="extract-utilities" Jan 28 19:45:00 crc kubenswrapper[4749]: E0128 19:45:00.183167 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerName="registry-server" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.183174 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerName="registry-server" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.183422 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff19f1ea-3dcc-479b-86e1-df5b0e8cd618" containerName="registry-server" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.184293 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.186370 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.186688 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.194582 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7"] Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.229179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-secret-volume\") pod \"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.229419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td5w8\" (UniqueName: \"kubernetes.io/projected/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-kube-api-access-td5w8\") pod \"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.230000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-config-volume\") pod \"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.333410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-config-volume\") pod \"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.333815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-secret-volume\") pod \"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.334009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td5w8\" (UniqueName: \"kubernetes.io/projected/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-kube-api-access-td5w8\") pod \"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.334657 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-config-volume\") pod 
\"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.340948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-secret-volume\") pod \"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.351924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td5w8\" (UniqueName: \"kubernetes.io/projected/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-kube-api-access-td5w8\") pod \"collect-profiles-29493825-wfph7\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.513199 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:00 crc kubenswrapper[4749]: I0128 19:45:00.996716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7"] Jan 28 19:45:01 crc kubenswrapper[4749]: W0128 19:45:01.005452 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35e931e7_ccd2_46c4_b506_9d1a9d8a813c.slice/crio-7edfa51dbe6a983f9f5f75db25d073d078fdbe401944fc00ea0fd20668a39252 WatchSource:0}: Error finding container 7edfa51dbe6a983f9f5f75db25d073d078fdbe401944fc00ea0fd20668a39252: Status 404 returned error can't find the container with id 7edfa51dbe6a983f9f5f75db25d073d078fdbe401944fc00ea0fd20668a39252 Jan 28 19:45:01 crc kubenswrapper[4749]: I0128 19:45:01.913259 4749 generic.go:334] "Generic (PLEG): container finished" podID="35e931e7-ccd2-46c4-b506-9d1a9d8a813c" containerID="08db8940e276d446237ee698cd2719243dbfaa0964c5edd5860a81cd6c9a58b3" exitCode=0 Jan 28 19:45:01 crc kubenswrapper[4749]: I0128 19:45:01.913307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" event={"ID":"35e931e7-ccd2-46c4-b506-9d1a9d8a813c","Type":"ContainerDied","Data":"08db8940e276d446237ee698cd2719243dbfaa0964c5edd5860a81cd6c9a58b3"} Jan 28 19:45:01 crc kubenswrapper[4749]: I0128 19:45:01.913613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" event={"ID":"35e931e7-ccd2-46c4-b506-9d1a9d8a813c","Type":"ContainerStarted","Data":"7edfa51dbe6a983f9f5f75db25d073d078fdbe401944fc00ea0fd20668a39252"} Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.088798 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.127648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td5w8\" (UniqueName: \"kubernetes.io/projected/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-kube-api-access-td5w8\") pod \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.127732 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-secret-volume\") pod \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.128018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-config-volume\") pod \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\" (UID: \"35e931e7-ccd2-46c4-b506-9d1a9d8a813c\") " Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.129555 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-config-volume" (OuterVolumeSpecName: "config-volume") pod "35e931e7-ccd2-46c4-b506-9d1a9d8a813c" (UID: "35e931e7-ccd2-46c4-b506-9d1a9d8a813c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.136758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35e931e7-ccd2-46c4-b506-9d1a9d8a813c" (UID: "35e931e7-ccd2-46c4-b506-9d1a9d8a813c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.139708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-kube-api-access-td5w8" (OuterVolumeSpecName: "kube-api-access-td5w8") pod "35e931e7-ccd2-46c4-b506-9d1a9d8a813c" (UID: "35e931e7-ccd2-46c4-b506-9d1a9d8a813c"). InnerVolumeSpecName "kube-api-access-td5w8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.230611 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.230653 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td5w8\" (UniqueName: \"kubernetes.io/projected/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-kube-api-access-td5w8\") on node \"crc\" DevicePath \"\"" Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.230670 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35e931e7-ccd2-46c4-b506-9d1a9d8a813c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.944750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" event={"ID":"35e931e7-ccd2-46c4-b506-9d1a9d8a813c","Type":"ContainerDied","Data":"7edfa51dbe6a983f9f5f75db25d073d078fdbe401944fc00ea0fd20668a39252"} Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.945097 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7edfa51dbe6a983f9f5f75db25d073d078fdbe401944fc00ea0fd20668a39252" Jan 28 19:45:04 crc kubenswrapper[4749]: I0128 19:45:04.944796 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493825-wfph7" Jan 28 19:45:05 crc kubenswrapper[4749]: I0128 19:45:05.164322 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8"] Jan 28 19:45:05 crc kubenswrapper[4749]: I0128 19:45:05.177012 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493780-277f8"] Jan 28 19:45:06 crc kubenswrapper[4749]: I0128 19:45:06.889486 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6666ce12-edde-4ccc-ad28-deb0f7c7ae25" path="/var/lib/kubelet/pods/6666ce12-edde-4ccc-ad28-deb0f7c7ae25/volumes" Jan 28 19:45:29 crc kubenswrapper[4749]: I0128 19:45:29.122830 4749 scope.go:117] "RemoveContainer" containerID="606c259c4aac7dfaf23dd9be63d3771fa6817f3a4ba28d749d36e7d7fe7465ca" Jan 28 19:46:57 crc kubenswrapper[4749]: I0128 19:46:57.470864 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:46:57 crc kubenswrapper[4749]: I0128 19:46:57.471498 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:47:27 crc kubenswrapper[4749]: I0128 19:47:27.467482 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 28 19:47:27 crc kubenswrapper[4749]: I0128 19:47:27.468033 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:47:57 crc kubenswrapper[4749]: I0128 19:47:57.467835 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:47:57 crc kubenswrapper[4749]: I0128 19:47:57.468576 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:47:57 crc kubenswrapper[4749]: I0128 19:47:57.468647 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:47:57 crc kubenswrapper[4749]: I0128 19:47:57.469840 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:47:57 crc kubenswrapper[4749]: I0128 19:47:57.469910 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" gracePeriod=600 Jan 28 19:47:57 crc kubenswrapper[4749]: E0128 19:47:57.599949 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:47:58 crc kubenswrapper[4749]: I0128 19:47:58.140459 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" exitCode=0 Jan 28 19:47:58 crc kubenswrapper[4749]: I0128 19:47:58.140870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d"} Jan 28 19:47:58 crc kubenswrapper[4749]: I0128 19:47:58.140905 4749 scope.go:117] "RemoveContainer" containerID="530d9e678bb30d21ac739a3cd8ac93cda8903db94744f64cdabeb20d1079d8fe" Jan 28 19:47:58 crc kubenswrapper[4749]: I0128 19:47:58.141994 4749 scope.go:117] "RemoveContainer" 
containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:47:58 crc kubenswrapper[4749]: E0128 19:47:58.142403 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:48:11 crc kubenswrapper[4749]: I0128 19:48:11.873917 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:48:11 crc kubenswrapper[4749]: E0128 19:48:11.875125 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:48:24 crc kubenswrapper[4749]: I0128 19:48:24.872766 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:48:24 crc kubenswrapper[4749]: E0128 19:48:24.873701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:48:37 crc kubenswrapper[4749]: I0128 19:48:37.872225 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:48:37 crc kubenswrapper[4749]: E0128 19:48:37.873141 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:48:48 crc kubenswrapper[4749]: I0128 19:48:48.872130 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:48:48 crc kubenswrapper[4749]: E0128 19:48:48.874174 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:49:02 crc kubenswrapper[4749]: I0128 19:49:02.881520 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:49:02 crc kubenswrapper[4749]: E0128 19:49:02.883316 4749 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:49:17 crc kubenswrapper[4749]: I0128 19:49:17.877050 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:49:17 crc kubenswrapper[4749]: E0128 19:49:17.877917 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:49:29 crc kubenswrapper[4749]: I0128 19:49:29.871791 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:49:29 crc kubenswrapper[4749]: E0128 19:49:29.872706 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:49:41 crc kubenswrapper[4749]: I0128 19:49:41.871578 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:49:41 crc kubenswrapper[4749]: E0128 19:49:41.872388 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:49:54 crc kubenswrapper[4749]: I0128 19:49:54.871764 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:49:54 crc kubenswrapper[4749]: E0128 19:49:54.872567 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:50:07 crc kubenswrapper[4749]: I0128 19:50:07.871774 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:50:07 crc kubenswrapper[4749]: E0128 19:50:07.872686 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:50:21 crc kubenswrapper[4749]: I0128 19:50:21.872743 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:50:21 crc kubenswrapper[4749]: E0128 19:50:21.873711 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:50:35 crc kubenswrapper[4749]: I0128 19:50:35.872209 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:50:35 crc kubenswrapper[4749]: E0128 19:50:35.873247 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:50:49 crc kubenswrapper[4749]: I0128 19:50:49.872920 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:50:49 crc kubenswrapper[4749]: E0128 19:50:49.873719 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:50:53 crc kubenswrapper[4749]: I0128 19:50:53.715202 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-68fb6b79f7-lxql7" podUID="5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 28 19:51:01 crc kubenswrapper[4749]: I0128 19:51:01.872256 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:51:01 crc kubenswrapper[4749]: E0128 19:51:01.873186 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:51:15 crc kubenswrapper[4749]: I0128 19:51:15.871974 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:51:15 crc kubenswrapper[4749]: E0128 19:51:15.873132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:51:30 crc kubenswrapper[4749]: I0128 19:51:30.880198 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:51:30 crc kubenswrapper[4749]: E0128 19:51:30.884934 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:51:41 crc kubenswrapper[4749]: I0128 19:51:41.872048 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:51:41 crc kubenswrapper[4749]: E0128 19:51:41.872966 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.370420 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hqvxc"] Jan 28 19:51:50 crc kubenswrapper[4749]: E0128 19:51:50.371778 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e931e7-ccd2-46c4-b506-9d1a9d8a813c" containerName="collect-profiles" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.371798 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e931e7-ccd2-46c4-b506-9d1a9d8a813c" containerName="collect-profiles" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.372065 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e931e7-ccd2-46c4-b506-9d1a9d8a813c" containerName="collect-profiles" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.373893 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.389357 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqvxc"] Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.510554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-utilities\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.510641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-catalog-content\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.510726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnltz\" (UniqueName: \"kubernetes.io/projected/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-kube-api-access-vnltz\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.613163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-utilities\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.613284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-catalog-content\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.613428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnltz\" (UniqueName: \"kubernetes.io/projected/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-kube-api-access-vnltz\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.613723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-utilities\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.613909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-catalog-content\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.643473 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vnltz\" (UniqueName: \"kubernetes.io/projected/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-kube-api-access-vnltz\") pod \"community-operators-hqvxc\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:50 crc kubenswrapper[4749]: I0128 19:51:50.700629 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:51:51 crc kubenswrapper[4749]: I0128 19:51:51.483180 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqvxc"] Jan 28 19:51:52 crc kubenswrapper[4749]: I0128 19:51:52.470117 4749 generic.go:334] "Generic (PLEG): container finished" podID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerID="d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8" exitCode=0 Jan 28 19:51:52 crc kubenswrapper[4749]: I0128 19:51:52.470179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqvxc" event={"ID":"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9","Type":"ContainerDied","Data":"d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8"} Jan 28 19:51:52 crc kubenswrapper[4749]: I0128 19:51:52.470453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqvxc" event={"ID":"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9","Type":"ContainerStarted","Data":"2236fe89dfd3bc2932f2707072288cf3ef22a2104feb38373dcf8e3d21627422"} Jan 28 19:51:52 crc kubenswrapper[4749]: I0128 19:51:52.474604 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 19:51:54 crc kubenswrapper[4749]: I0128 19:51:54.492997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqvxc" event={"ID":"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9","Type":"ContainerStarted","Data":"fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155"} Jan 28 19:51:54 crc kubenswrapper[4749]: I0128 19:51:54.872120 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:51:54 crc kubenswrapper[4749]: E0128 19:51:54.872469 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:51:59 crc kubenswrapper[4749]: I0128 19:51:59.583979 4749 generic.go:334] "Generic (PLEG): container finished" podID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerID="fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155" exitCode=0 Jan 28 19:51:59 crc kubenswrapper[4749]: I0128 19:51:59.584186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqvxc" event={"ID":"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9","Type":"ContainerDied","Data":"fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155"} Jan 28 19:52:01 crc kubenswrapper[4749]: I0128 19:52:01.607005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqvxc" 
event={"ID":"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9","Type":"ContainerStarted","Data":"f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015"} Jan 28 19:52:01 crc kubenswrapper[4749]: I0128 19:52:01.632256 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hqvxc" podStartSLOduration=3.632272229 podStartE2EDuration="11.632233501s" podCreationTimestamp="2026-01-28 19:51:50 +0000 UTC" firstStartedPulling="2026-01-28 19:51:52.474347023 +0000 UTC m=+4580.485873798" lastFinishedPulling="2026-01-28 19:52:00.474308295 +0000 UTC m=+4588.485835070" observedRunningTime="2026-01-28 19:52:01.629396622 +0000 UTC m=+4589.640923427" watchObservedRunningTime="2026-01-28 19:52:01.632233501 +0000 UTC m=+4589.643760276" Jan 28 19:52:05 crc kubenswrapper[4749]: I0128 19:52:05.871853 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:52:05 crc kubenswrapper[4749]: E0128 19:52:05.874132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:52:10 crc kubenswrapper[4749]: I0128 19:52:10.723181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:52:10 crc kubenswrapper[4749]: I0128 19:52:10.723855 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:52:10 crc kubenswrapper[4749]: I0128 19:52:10.750846 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:52:11 crc kubenswrapper[4749]: I0128 19:52:11.748357 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:52:11 crc kubenswrapper[4749]: I0128 19:52:11.799486 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hqvxc"] Jan 28 19:52:13 crc kubenswrapper[4749]: I0128 19:52:13.718881 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hqvxc" podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerName="registry-server" containerID="cri-o://f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015" gracePeriod=2 Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.283288 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.389135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-catalog-content\") pod \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.389239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnltz\" (UniqueName: \"kubernetes.io/projected/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-kube-api-access-vnltz\") pod \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.389338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-utilities\") pod \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\" (UID: \"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9\") " Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.390677 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-utilities" (OuterVolumeSpecName: "utilities") pod "09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" (UID: "09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.399606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-kube-api-access-vnltz" (OuterVolumeSpecName: "kube-api-access-vnltz") pod "09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" (UID: "09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9"). InnerVolumeSpecName "kube-api-access-vnltz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.443799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" (UID: "09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.492813 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.492852 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnltz\" (UniqueName: \"kubernetes.io/projected/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-kube-api-access-vnltz\") on node \"crc\" DevicePath \"\"" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.492865 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.730229 4749 generic.go:334] "Generic (PLEG): container finished" podID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerID="f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015" exitCode=0 Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.730284 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hqvxc" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.730536 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqvxc" event={"ID":"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9","Type":"ContainerDied","Data":"f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015"} Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.730662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqvxc" event={"ID":"09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9","Type":"ContainerDied","Data":"2236fe89dfd3bc2932f2707072288cf3ef22a2104feb38373dcf8e3d21627422"} Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.730701 4749 scope.go:117] "RemoveContainer" containerID="f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.759537 4749 scope.go:117] "RemoveContainer" containerID="fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.774299 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hqvxc"] Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.784259 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hqvxc"] Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.796873 4749 scope.go:117] "RemoveContainer" containerID="d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.839962 4749 scope.go:117] "RemoveContainer" containerID="f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015" Jan 28 19:52:14 crc kubenswrapper[4749]: E0128 19:52:14.840521 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015\": container with ID starting with f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015 not found: ID does not exist" containerID="f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.840580 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015"} err="failed to get container status \"f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015\": rpc error: code = NotFound desc = could not find container \"f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015\": container with ID starting with f207905c939fc7c6be1bbb2edd4bbcc1ff9f05344bb529684e460e40f5d3d015 not found: ID does not exist" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.840614 4749 scope.go:117] "RemoveContainer" containerID="fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155" Jan 28 19:52:14 crc kubenswrapper[4749]: E0128 19:52:14.841054 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155\": container with ID starting with fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155 not found: ID does not exist" containerID="fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.841096 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155"} err="failed to get container status \"fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155\": rpc error: code = NotFound desc = could not find container \"fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155\": container with ID starting with fe02088661d9368385bcbfbc2ca0d7086c5c16c8d78ed532d7f1544166cac155 not found: ID does not exist" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.841123 4749 scope.go:117] "RemoveContainer" containerID="d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8" Jan 28 19:52:14 crc kubenswrapper[4749]: E0128 19:52:14.841391 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8\": container with ID starting with d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8 not found: ID does not exist" containerID="d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.841416 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8"} err="failed to get container status \"d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8\": rpc error: code = NotFound desc = could not find container \"d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8\": container with ID starting with d6cdc5157e0170fa5364e71510b602a71806e6d5e1451d5eadcbcf5a3ac35fa8 not found: ID does not exist" Jan 28 19:52:14 crc kubenswrapper[4749]: I0128 19:52:14.886676 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" path="/var/lib/kubelet/pods/09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9/volumes" Jan 28 19:52:20 crc kubenswrapper[4749]: I0128 19:52:20.873400 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:52:20 crc kubenswrapper[4749]: E0128 19:52:20.874223 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:52:32 crc kubenswrapper[4749]: I0128 19:52:32.882536 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:52:32 crc kubenswrapper[4749]: E0128 19:52:32.883823 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:52:45 crc kubenswrapper[4749]: I0128 19:52:45.872439 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:52:45 crc kubenswrapper[4749]: E0128 19:52:45.873262 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:53:00 crc kubenswrapper[4749]: I0128 19:53:00.872049 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:53:01 crc kubenswrapper[4749]: I0128 19:53:01.233387 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"235aa65ffdcac809260e67b0759c91f4859ee458a18b8ff92e354cd57d7184b7"} Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.202892 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hvx7j"] Jan 28 19:53:24 crc kubenswrapper[4749]: E0128 19:53:24.204147 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerName="extract-content" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.204168 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerName="extract-content" Jan 28 19:53:24 crc kubenswrapper[4749]: E0128 19:53:24.204191 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerName="registry-server" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.204199 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerName="registry-server" Jan 28 19:53:24 crc kubenswrapper[4749]: E0128 19:53:24.204252 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerName="extract-utilities" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.204261 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerName="extract-utilities" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.204654 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a8fcd2-c46f-4c21-8fa0-e66e8adb5bf9" containerName="registry-server" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.206854 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.229154 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvx7j"] Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.327223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-catalog-content\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.327571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-utilities\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.327613 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgjvw\" (UniqueName: \"kubernetes.io/projected/02424a4b-86f6-4070-b52b-fa81b9c3ee31-kube-api-access-lgjvw\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.429718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-catalog-content\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.429806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-utilities\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.429831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgjvw\" (UniqueName: \"kubernetes.io/projected/02424a4b-86f6-4070-b52b-fa81b9c3ee31-kube-api-access-lgjvw\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.430306 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-catalog-content\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.430535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-utilities\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.451611 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgjvw\" (UniqueName: \"kubernetes.io/projected/02424a4b-86f6-4070-b52b-fa81b9c3ee31-kube-api-access-lgjvw\") pod \"redhat-operators-hvx7j\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:24 crc kubenswrapper[4749]: I0128 19:53:24.533652 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:25 crc kubenswrapper[4749]: I0128 19:53:25.088070 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvx7j"] Jan 28 19:53:25 crc kubenswrapper[4749]: I0128 19:53:25.487223 4749 generic.go:334] "Generic (PLEG): container finished" podID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerID="e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40" exitCode=0 Jan 28 19:53:25 crc kubenswrapper[4749]: I0128 19:53:25.487281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvx7j" event={"ID":"02424a4b-86f6-4070-b52b-fa81b9c3ee31","Type":"ContainerDied","Data":"e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40"} Jan 28 19:53:25 crc kubenswrapper[4749]: I0128 19:53:25.487315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvx7j" event={"ID":"02424a4b-86f6-4070-b52b-fa81b9c3ee31","Type":"ContainerStarted","Data":"c23c2b7905d47c30efc7c750f7ee801ab0c0926f9f361a138664a25047a3bfe0"} Jan 28 19:53:27 crc kubenswrapper[4749]: I0128 19:53:27.509435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvx7j" event={"ID":"02424a4b-86f6-4070-b52b-fa81b9c3ee31","Type":"ContainerStarted","Data":"2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415"} Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.108853 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xrbzx"] Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.112433 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.120734 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xrbzx"] Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.236235 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-utilities\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.236561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-catalog-content\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.236725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9tp\" (UniqueName: \"kubernetes.io/projected/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-kube-api-access-4c9tp\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.339059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-utilities\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.339242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-catalog-content\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.339343 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9tp\" (UniqueName: \"kubernetes.io/projected/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-kube-api-access-4c9tp\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.339513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-utilities\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.339557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-catalog-content\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.372472 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4c9tp\" (UniqueName: \"kubernetes.io/projected/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-kube-api-access-4c9tp\") pod \"certified-operators-xrbzx\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.451033 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.605058 4749 generic.go:334] "Generic (PLEG): container finished" podID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerID="2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415" exitCode=0 Jan 28 19:53:32 crc kubenswrapper[4749]: I0128 19:53:32.605347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvx7j" event={"ID":"02424a4b-86f6-4070-b52b-fa81b9c3ee31","Type":"ContainerDied","Data":"2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415"} Jan 28 19:53:33 crc kubenswrapper[4749]: I0128 19:53:33.058921 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xrbzx"] Jan 28 19:53:33 crc kubenswrapper[4749]: I0128 19:53:33.618787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvx7j" event={"ID":"02424a4b-86f6-4070-b52b-fa81b9c3ee31","Type":"ContainerStarted","Data":"568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9"} Jan 28 19:53:33 crc kubenswrapper[4749]: I0128 19:53:33.620541 4749 generic.go:334] "Generic (PLEG): container finished" podID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerID="f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1" exitCode=0 Jan 28 19:53:33 crc kubenswrapper[4749]: I0128 19:53:33.620576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrbzx" event={"ID":"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec","Type":"ContainerDied","Data":"f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1"} Jan 28 19:53:33 crc kubenswrapper[4749]: I0128 19:53:33.620611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrbzx" event={"ID":"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec","Type":"ContainerStarted","Data":"44bdf1f31c7f28735febfc60f95599b308fc247ccbca88446150b4f0e199ffa5"} Jan 28 19:53:33 crc kubenswrapper[4749]: I0128 19:53:33.647959 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hvx7j" podStartSLOduration=2.113166963 podStartE2EDuration="9.647927857s" podCreationTimestamp="2026-01-28 19:53:24 +0000 UTC" firstStartedPulling="2026-01-28 19:53:25.489144368 +0000 UTC m=+4673.500671153" lastFinishedPulling="2026-01-28 19:53:33.023905272 +0000 UTC m=+4681.035432047" observedRunningTime="2026-01-28 19:53:33.638452126 +0000 UTC m=+4681.649978911" watchObservedRunningTime="2026-01-28 19:53:33.647927857 +0000 UTC m=+4681.659454632" Jan 28 19:53:34 crc kubenswrapper[4749]: I0128 19:53:34.532624 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:34 crc kubenswrapper[4749]: I0128 19:53:34.534852 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:53:34 crc kubenswrapper[4749]: I0128 19:53:34.633546 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrbzx" event={"ID":"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec","Type":"ContainerStarted","Data":"1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a"} Jan 28 19:53:35 crc kubenswrapper[4749]: I0128 19:53:35.584849 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvx7j" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="registry-server" probeResult="failure" output=< Jan 28 19:53:35 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:53:35 crc kubenswrapper[4749]: > Jan 28 19:53:36 crc kubenswrapper[4749]: I0128 19:53:36.652695 4749 generic.go:334] "Generic (PLEG): container finished" podID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerID="1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a" exitCode=0 Jan 28 19:53:36 crc kubenswrapper[4749]: I0128 19:53:36.652770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrbzx" event={"ID":"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec","Type":"ContainerDied","Data":"1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a"} Jan 28 19:53:37 crc kubenswrapper[4749]: I0128 19:53:37.665704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrbzx" event={"ID":"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec","Type":"ContainerStarted","Data":"fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705"} Jan 28 19:53:37 crc kubenswrapper[4749]: I0128 19:53:37.699192 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xrbzx" podStartSLOduration=2.252706573 podStartE2EDuration="5.699175237s" podCreationTimestamp="2026-01-28 19:53:32 +0000 UTC" firstStartedPulling="2026-01-28 19:53:33.630282077 +0000 UTC m=+4681.641808852" lastFinishedPulling="2026-01-28 19:53:37.076750741 +0000 UTC m=+4685.088277516" observedRunningTime="2026-01-28 19:53:37.693311024 +0000 UTC m=+4685.704837829" watchObservedRunningTime="2026-01-28 19:53:37.699175237 +0000 UTC m=+4685.710702012" Jan 28 19:53:42 crc kubenswrapper[4749]: I0128 19:53:42.451809 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:42 crc kubenswrapper[4749]: I0128 19:53:42.453094 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:42 crc kubenswrapper[4749]: I0128 19:53:42.504075 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:42 crc kubenswrapper[4749]: I0128 19:53:42.763116 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:42 crc kubenswrapper[4749]: I0128 19:53:42.818180 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xrbzx"] Jan 28 19:53:44 crc kubenswrapper[4749]: I0128 19:53:44.732569 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xrbzx" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerName="registry-server" containerID="cri-o://fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705" gracePeriod=2 Jan 28 19:53:45 crc kubenswrapper[4749]: 
I0128 19:53:45.309969 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.467130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c9tp\" (UniqueName: \"kubernetes.io/projected/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-kube-api-access-4c9tp\") pod \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.467443 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-catalog-content\") pod \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.467669 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-utilities\") pod \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\" (UID: \"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec\") " Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.468583 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-utilities" (OuterVolumeSpecName: "utilities") pod "6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" (UID: "6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.475200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-kube-api-access-4c9tp" (OuterVolumeSpecName: "kube-api-access-4c9tp") pod "6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" (UID: "6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec"). InnerVolumeSpecName "kube-api-access-4c9tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.525285 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" (UID: "6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.570745 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c9tp\" (UniqueName: \"kubernetes.io/projected/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-kube-api-access-4c9tp\") on node \"crc\" DevicePath \"\"" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.570810 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.570820 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.580073 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvx7j" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="registry-server" probeResult="failure" output=< Jan 28 19:53:45 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:53:45 crc kubenswrapper[4749]: > Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.744223 4749 generic.go:334] "Generic (PLEG): container finished" podID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerID="fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705" exitCode=0 Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.744264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrbzx" event={"ID":"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec","Type":"ContainerDied","Data":"fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705"} Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.744289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrbzx" event={"ID":"6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec","Type":"ContainerDied","Data":"44bdf1f31c7f28735febfc60f95599b308fc247ccbca88446150b4f0e199ffa5"} Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.744339 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xrbzx" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.744320 4749 scope.go:117] "RemoveContainer" containerID="fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.776068 4749 scope.go:117] "RemoveContainer" containerID="1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.785396 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xrbzx"] Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.794494 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xrbzx"] Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.804572 4749 scope.go:117] "RemoveContainer" containerID="f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.856781 4749 scope.go:117] "RemoveContainer" containerID="fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705" Jan 28 19:53:45 crc kubenswrapper[4749]: E0128 19:53:45.857340 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705\": container with ID starting with fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705 not found: ID does not exist" containerID="fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.857378 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705"} err="failed to get container status \"fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705\": rpc error: code = NotFound desc = could not find container \"fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705\": container with ID starting with fe0a353e11e132ce7d9c1a0784c9cf2720ddcee87325d37bbf01215b3a8c6705 not found: ID does not exist" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.857402 4749 scope.go:117] "RemoveContainer" containerID="1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a" Jan 28 19:53:45 crc kubenswrapper[4749]: E0128 19:53:45.857815 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a\": container with ID starting with 1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a not found: ID does not exist" containerID="1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.857856 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a"} err="failed to get container status \"1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a\": rpc error: code = NotFound desc = could not find container \"1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a\": container with ID starting with 1521ff0a81e23f073f24ae00dad69d2e5fa8e3ac7fec8120b88db2b62a39022a not found: ID does not exist" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.857885 4749 scope.go:117] "RemoveContainer" 
containerID="f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1" Jan 28 19:53:45 crc kubenswrapper[4749]: E0128 19:53:45.858219 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1\": container with ID starting with f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1 not found: ID does not exist" containerID="f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1" Jan 28 19:53:45 crc kubenswrapper[4749]: I0128 19:53:45.858251 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1"} err="failed to get container status \"f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1\": rpc error: code = NotFound desc = could not find container \"f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1\": container with ID starting with f987a5af8d9291007e50c811858e722e22ac2118eb6ecf83725e455504071cb1 not found: ID does not exist" Jan 28 19:53:46 crc kubenswrapper[4749]: I0128 19:53:46.887456 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" path="/var/lib/kubelet/pods/6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec/volumes" Jan 28 19:53:55 crc kubenswrapper[4749]: I0128 19:53:55.581202 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvx7j" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="registry-server" probeResult="failure" output=< Jan 28 19:53:55 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 19:53:55 crc kubenswrapper[4749]: > Jan 28 19:54:04 crc kubenswrapper[4749]: I0128 19:54:04.708105 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:54:04 crc kubenswrapper[4749]: I0128 19:54:04.755679 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:54:04 crc kubenswrapper[4749]: I0128 19:54:04.950932 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvx7j"] Jan 28 19:54:05 crc kubenswrapper[4749]: I0128 19:54:05.973040 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hvx7j" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="registry-server" containerID="cri-o://568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9" gracePeriod=2 Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.462221 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.517624 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-catalog-content\") pod \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.517779 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgjvw\" (UniqueName: \"kubernetes.io/projected/02424a4b-86f6-4070-b52b-fa81b9c3ee31-kube-api-access-lgjvw\") pod \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.518000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-utilities\") pod \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\" (UID: \"02424a4b-86f6-4070-b52b-fa81b9c3ee31\") " Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.518519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-utilities" (OuterVolumeSpecName: "utilities") pod "02424a4b-86f6-4070-b52b-fa81b9c3ee31" (UID: "02424a4b-86f6-4070-b52b-fa81b9c3ee31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.524942 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02424a4b-86f6-4070-b52b-fa81b9c3ee31-kube-api-access-lgjvw" (OuterVolumeSpecName: "kube-api-access-lgjvw") pod "02424a4b-86f6-4070-b52b-fa81b9c3ee31" (UID: "02424a4b-86f6-4070-b52b-fa81b9c3ee31"). InnerVolumeSpecName "kube-api-access-lgjvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.621009 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.621319 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgjvw\" (UniqueName: \"kubernetes.io/projected/02424a4b-86f6-4070-b52b-fa81b9c3ee31-kube-api-access-lgjvw\") on node \"crc\" DevicePath \"\"" Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.641163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02424a4b-86f6-4070-b52b-fa81b9c3ee31" (UID: "02424a4b-86f6-4070-b52b-fa81b9c3ee31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.724716 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02424a4b-86f6-4070-b52b-fa81b9c3ee31-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.985034 4749 generic.go:334] "Generic (PLEG): container finished" podID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerID="568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9" exitCode=0 Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.985079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvx7j" event={"ID":"02424a4b-86f6-4070-b52b-fa81b9c3ee31","Type":"ContainerDied","Data":"568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9"} Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.985105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvx7j" event={"ID":"02424a4b-86f6-4070-b52b-fa81b9c3ee31","Type":"ContainerDied","Data":"c23c2b7905d47c30efc7c750f7ee801ab0c0926f9f361a138664a25047a3bfe0"} Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.985122 4749 scope.go:117] "RemoveContainer" containerID="568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9" Jan 28 19:54:06 crc kubenswrapper[4749]: I0128 19:54:06.985287 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvx7j" Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.012907 4749 scope.go:117] "RemoveContainer" containerID="2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415" Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.018558 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvx7j"] Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.033721 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hvx7j"] Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.047499 4749 scope.go:117] "RemoveContainer" containerID="e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40" Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.085858 4749 scope.go:117] "RemoveContainer" containerID="568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9" Jan 28 19:54:07 crc kubenswrapper[4749]: E0128 19:54:07.086319 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9\": container with ID starting with 568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9 not found: ID does not exist" containerID="568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9" Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.086379 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9"} err="failed to get container status \"568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9\": rpc error: code = NotFound desc = could not find container \"568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9\": container with ID starting with 568eb6c5dc361542bb680afdcb9c6e615836d5ab66c0cbf3a8e335dd0d6cbfe9 not found: ID does not exist" Jan 28 19:54:07 crc 
kubenswrapper[4749]: I0128 19:54:07.086405 4749 scope.go:117] "RemoveContainer" containerID="2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415" Jan 28 19:54:07 crc kubenswrapper[4749]: E0128 19:54:07.087010 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415\": container with ID starting with 2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415 not found: ID does not exist" containerID="2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415" Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.087036 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415"} err="failed to get container status \"2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415\": rpc error: code = NotFound desc = could not find container \"2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415\": container with ID starting with 2175733c5c92f33e5bf88da2a4a594ff9040fcb18d6dde125817e71d2d0e1415 not found: ID does not exist" Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.087051 4749 scope.go:117] "RemoveContainer" containerID="e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40" Jan 28 19:54:07 crc kubenswrapper[4749]: E0128 19:54:07.087457 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40\": container with ID starting with e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40 not found: ID does not exist" containerID="e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40" Jan 28 19:54:07 crc kubenswrapper[4749]: I0128 19:54:07.087481 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40"} err="failed to get container status \"e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40\": rpc error: code = NotFound desc = could not find container \"e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40\": container with ID starting with e7227aac1b7331ba55c1a0f1bd6b2df76667c9e852e20a1f9a23309a1f8e1d40 not found: ID does not exist" Jan 28 19:54:08 crc kubenswrapper[4749]: I0128 19:54:08.888580 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" path="/var/lib/kubelet/pods/02424a4b-86f6-4070-b52b-fa81b9c3ee31/volumes" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.595040 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dghlm"] Jan 28 19:55:14 crc kubenswrapper[4749]: E0128 19:55:14.596052 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerName="extract-utilities" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.596066 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerName="extract-utilities" Jan 28 19:55:14 crc kubenswrapper[4749]: E0128 19:55:14.596077 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="registry-server" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.596085 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="registry-server" Jan 28 19:55:14 crc kubenswrapper[4749]: E0128 19:55:14.596109 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="extract-utilities" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.596115 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="extract-utilities" Jan 28 19:55:14 crc kubenswrapper[4749]: E0128 19:55:14.596127 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="extract-content" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.596135 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="extract-content" Jan 28 19:55:14 crc kubenswrapper[4749]: E0128 19:55:14.596146 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerName="extract-content" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.596153 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerName="extract-content" Jan 28 19:55:14 crc kubenswrapper[4749]: E0128 19:55:14.596166 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerName="registry-server" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.596172 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerName="registry-server" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.596427 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbe996e-3512-4dc6-bd06-82b8a0eaf0ec" containerName="registry-server" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.596454 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="02424a4b-86f6-4070-b52b-fa81b9c3ee31" containerName="registry-server" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.598197 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.606827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghlm"] Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.751082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bn64\" (UniqueName: \"kubernetes.io/projected/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-kube-api-access-6bn64\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.751190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-utilities\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.751219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-catalog-content\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.853490 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bn64\" (UniqueName: \"kubernetes.io/projected/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-kube-api-access-6bn64\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.853943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-utilities\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.854047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-catalog-content\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.854472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-catalog-content\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.854681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-utilities\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.879520 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6bn64\" (UniqueName: \"kubernetes.io/projected/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-kube-api-access-6bn64\") pod \"redhat-marketplace-dghlm\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:14 crc kubenswrapper[4749]: I0128 19:55:14.931698 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:15 crc kubenswrapper[4749]: I0128 19:55:15.454248 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghlm"] Jan 28 19:55:15 crc kubenswrapper[4749]: I0128 19:55:15.720604 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerID="136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc" exitCode=0 Jan 28 19:55:15 crc kubenswrapper[4749]: I0128 19:55:15.720657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghlm" event={"ID":"4f6b0e40-44fe-421f-a7a6-483c81c20a6c","Type":"ContainerDied","Data":"136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc"} Jan 28 19:55:15 crc kubenswrapper[4749]: I0128 19:55:15.720720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghlm" event={"ID":"4f6b0e40-44fe-421f-a7a6-483c81c20a6c","Type":"ContainerStarted","Data":"3abbc210b8bf6c75513db72e1fda7c81ce5c6f6ab063eb5a2c826476499a800f"} Jan 28 19:55:18 crc kubenswrapper[4749]: I0128 19:55:18.758769 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerID="80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6" exitCode=0 Jan 28 19:55:18 crc kubenswrapper[4749]: I0128 19:55:18.758861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghlm" event={"ID":"4f6b0e40-44fe-421f-a7a6-483c81c20a6c","Type":"ContainerDied","Data":"80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6"} Jan 28 19:55:19 crc kubenswrapper[4749]: I0128 19:55:19.769753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghlm" event={"ID":"4f6b0e40-44fe-421f-a7a6-483c81c20a6c","Type":"ContainerStarted","Data":"97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d"} Jan 28 19:55:19 crc kubenswrapper[4749]: I0128 19:55:19.797041 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dghlm" podStartSLOduration=2.295112165 podStartE2EDuration="5.797018728s" podCreationTimestamp="2026-01-28 19:55:14 +0000 UTC" firstStartedPulling="2026-01-28 19:55:15.722502012 +0000 UTC m=+4783.734028787" lastFinishedPulling="2026-01-28 19:55:19.224408575 +0000 UTC m=+4787.235935350" observedRunningTime="2026-01-28 19:55:19.785259522 +0000 UTC m=+4787.796786307" watchObservedRunningTime="2026-01-28 19:55:19.797018728 +0000 UTC m=+4787.808545513" Jan 28 19:55:24 crc kubenswrapper[4749]: I0128 19:55:24.931878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:24 crc kubenswrapper[4749]: I0128 19:55:24.932688 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:25 crc kubenswrapper[4749]: I0128 19:55:25.005991 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:25 crc kubenswrapper[4749]: I0128 19:55:25.868847 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:25 crc kubenswrapper[4749]: I0128 19:55:25.916416 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghlm"] Jan 28 19:55:27 crc kubenswrapper[4749]: I0128 19:55:27.467234 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:55:27 crc kubenswrapper[4749]: I0128 19:55:27.467668 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:55:27 crc kubenswrapper[4749]: I0128 19:55:27.838378 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dghlm" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerName="registry-server" containerID="cri-o://97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d" gracePeriod=2 Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.356879 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.399025 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-utilities\") pod \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.399159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bn64\" (UniqueName: \"kubernetes.io/projected/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-kube-api-access-6bn64\") pod \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.399193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-catalog-content\") pod \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\" (UID: \"4f6b0e40-44fe-421f-a7a6-483c81c20a6c\") " Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.401822 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-utilities" (OuterVolumeSpecName: "utilities") pod "4f6b0e40-44fe-421f-a7a6-483c81c20a6c" (UID: "4f6b0e40-44fe-421f-a7a6-483c81c20a6c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.407651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-kube-api-access-6bn64" (OuterVolumeSpecName: "kube-api-access-6bn64") pod "4f6b0e40-44fe-421f-a7a6-483c81c20a6c" (UID: "4f6b0e40-44fe-421f-a7a6-483c81c20a6c"). InnerVolumeSpecName "kube-api-access-6bn64". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.427315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f6b0e40-44fe-421f-a7a6-483c81c20a6c" (UID: "4f6b0e40-44fe-421f-a7a6-483c81c20a6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.504240 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.504290 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bn64\" (UniqueName: \"kubernetes.io/projected/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-kube-api-access-6bn64\") on node \"crc\" DevicePath \"\"" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.504306 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f6b0e40-44fe-421f-a7a6-483c81c20a6c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.863987 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerID="97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d" exitCode=0 Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.864084 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dghlm" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.864106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghlm" event={"ID":"4f6b0e40-44fe-421f-a7a6-483c81c20a6c","Type":"ContainerDied","Data":"97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d"} Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.864453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dghlm" event={"ID":"4f6b0e40-44fe-421f-a7a6-483c81c20a6c","Type":"ContainerDied","Data":"3abbc210b8bf6c75513db72e1fda7c81ce5c6f6ab063eb5a2c826476499a800f"} Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.864474 4749 scope.go:117] "RemoveContainer" containerID="97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.909241 4749 scope.go:117] "RemoveContainer" containerID="80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.961993 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghlm"] Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.966931 4749 scope.go:117] "RemoveContainer" containerID="136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc" Jan 28 19:55:28 crc kubenswrapper[4749]: I0128 19:55:28.978981 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dghlm"] Jan 28 19:55:29 crc kubenswrapper[4749]: I0128 19:55:29.008483 4749 scope.go:117] "RemoveContainer" containerID="97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d" Jan 28 19:55:29 crc kubenswrapper[4749]: E0128 19:55:29.009351 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d\": container with ID starting with 97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d not found: ID does not exist" containerID="97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d" Jan 28 19:55:29 crc kubenswrapper[4749]: I0128 19:55:29.009459 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d"} err="failed to get container status \"97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d\": rpc error: code = NotFound desc = could not find container \"97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d\": container with ID starting with 97288e90e41ebaead3b41a1c49e1b4cd09bb63e5cdc4ddd1fb8ebd9dd1f5ab3d not found: ID does not exist" Jan 28 19:55:29 crc kubenswrapper[4749]: I0128 19:55:29.009542 4749 scope.go:117] "RemoveContainer" containerID="80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6" Jan 28 19:55:29 crc kubenswrapper[4749]: E0128 19:55:29.010113 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6\": container with ID starting with 80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6 not found: ID does not exist" containerID="80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6" Jan 28 19:55:29 crc kubenswrapper[4749]: I0128 19:55:29.010188 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6"} err="failed to get container status \"80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6\": rpc error: code = NotFound desc = could not find container \"80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6\": container with ID starting with 80ecef3b0c7dc52899b5119d2e2d93df566ce3bf306d3fceeec02cd2dfa514d6 not found: ID does not exist" Jan 28 19:55:29 crc kubenswrapper[4749]: I0128 19:55:29.010231 4749 scope.go:117] "RemoveContainer" containerID="136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc" Jan 28 19:55:29 crc kubenswrapper[4749]: E0128 19:55:29.012548 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc\": container with ID starting with 136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc not found: ID does not exist" containerID="136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc" Jan 28 19:55:29 crc kubenswrapper[4749]: I0128 19:55:29.012597 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc"} err="failed to get container status \"136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc\": rpc error: code = NotFound desc = could not find container \"136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc\": container with ID starting with 136a1877965744272fa03ab4c065c20992d7b60abd8c542cec46c4676a15fcbc not found: ID does not exist" Jan 28 19:55:30 crc kubenswrapper[4749]: I0128 19:55:30.885919 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" path="/var/lib/kubelet/pods/4f6b0e40-44fe-421f-a7a6-483c81c20a6c/volumes" Jan 28 19:55:57 crc kubenswrapper[4749]: I0128 19:55:57.466825 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:55:57 crc kubenswrapper[4749]: I0128 19:55:57.467351 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:56:27 crc kubenswrapper[4749]: I0128 19:56:27.467120 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:56:27 crc kubenswrapper[4749]: I0128 19:56:27.467726 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:56:27 crc kubenswrapper[4749]: I0128 
19:56:27.467816 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:56:27 crc kubenswrapper[4749]: I0128 19:56:27.468833 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"235aa65ffdcac809260e67b0759c91f4859ee458a18b8ff92e354cd57d7184b7"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:56:27 crc kubenswrapper[4749]: I0128 19:56:27.468898 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://235aa65ffdcac809260e67b0759c91f4859ee458a18b8ff92e354cd57d7184b7" gracePeriod=600 Jan 28 19:56:28 crc kubenswrapper[4749]: I0128 19:56:28.456980 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="235aa65ffdcac809260e67b0759c91f4859ee458a18b8ff92e354cd57d7184b7" exitCode=0 Jan 28 19:56:28 crc kubenswrapper[4749]: I0128 19:56:28.457048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"235aa65ffdcac809260e67b0759c91f4859ee458a18b8ff92e354cd57d7184b7"} Jan 28 19:56:28 crc kubenswrapper[4749]: I0128 19:56:28.457604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911"} Jan 28 19:56:28 crc kubenswrapper[4749]: I0128 19:56:28.457652 4749 scope.go:117] "RemoveContainer" containerID="1ec35c055ea0a457f79bd980f4c518aea2387af1a0ba114f836b1fbd0845458d" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.384347 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p5nz9/must-gather-rbf9q"] Jan 28 19:56:40 crc kubenswrapper[4749]: E0128 19:56:40.385319 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerName="registry-server" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.385350 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerName="registry-server" Jan 28 19:56:40 crc kubenswrapper[4749]: E0128 19:56:40.385386 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerName="extract-utilities" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.385392 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerName="extract-utilities" Jan 28 19:56:40 crc kubenswrapper[4749]: E0128 19:56:40.385410 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerName="extract-content" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.385416 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerName="extract-content" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.385612 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4f6b0e40-44fe-421f-a7a6-483c81c20a6c" containerName="registry-server" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.387526 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.390393 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p5nz9"/"kube-root-ca.crt" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.390720 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p5nz9"/"openshift-service-ca.crt" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.390986 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-p5nz9"/"default-dockercfg-m7xzg" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.398548 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p5nz9/must-gather-rbf9q"] Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.574682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b53256e-31af-4b31-908a-e0341acbf58a-must-gather-output\") pod \"must-gather-rbf9q\" (UID: \"6b53256e-31af-4b31-908a-e0341acbf58a\") " pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.574869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m64zw\" (UniqueName: \"kubernetes.io/projected/6b53256e-31af-4b31-908a-e0341acbf58a-kube-api-access-m64zw\") pod \"must-gather-rbf9q\" (UID: \"6b53256e-31af-4b31-908a-e0341acbf58a\") " pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.677665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m64zw\" (UniqueName: \"kubernetes.io/projected/6b53256e-31af-4b31-908a-e0341acbf58a-kube-api-access-m64zw\") pod \"must-gather-rbf9q\" (UID: \"6b53256e-31af-4b31-908a-e0341acbf58a\") " pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.677888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b53256e-31af-4b31-908a-e0341acbf58a-must-gather-output\") pod \"must-gather-rbf9q\" (UID: \"6b53256e-31af-4b31-908a-e0341acbf58a\") " pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 19:56:40 crc kubenswrapper[4749]: I0128 19:56:40.678463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b53256e-31af-4b31-908a-e0341acbf58a-must-gather-output\") pod \"must-gather-rbf9q\" (UID: \"6b53256e-31af-4b31-908a-e0341acbf58a\") " pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 19:56:41 crc kubenswrapper[4749]: I0128 19:56:41.158526 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m64zw\" (UniqueName: \"kubernetes.io/projected/6b53256e-31af-4b31-908a-e0341acbf58a-kube-api-access-m64zw\") pod \"must-gather-rbf9q\" (UID: \"6b53256e-31af-4b31-908a-e0341acbf58a\") " pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 19:56:41 crc kubenswrapper[4749]: I0128 19:56:41.313650 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 19:56:41 crc kubenswrapper[4749]: I0128 19:56:41.774284 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p5nz9/must-gather-rbf9q"] Jan 28 19:56:42 crc kubenswrapper[4749]: I0128 19:56:42.638032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" event={"ID":"6b53256e-31af-4b31-908a-e0341acbf58a","Type":"ContainerStarted","Data":"189b89cf16cac7ee0a3a244ea6113277488c8ec09887cba07ecce3afbcdddf5f"} Jan 28 19:56:50 crc kubenswrapper[4749]: I0128 19:56:50.740666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" event={"ID":"6b53256e-31af-4b31-908a-e0341acbf58a","Type":"ContainerStarted","Data":"86e914daf6c43a11a5468bc809932f11059f841ca66847c98fb71dbf67aec2e1"} Jan 28 19:56:50 crc kubenswrapper[4749]: I0128 19:56:50.741223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" event={"ID":"6b53256e-31af-4b31-908a-e0341acbf58a","Type":"ContainerStarted","Data":"3cb8995d08f92a29ffd672daf9e937e1f0a1ffec79603aaa9c4d7df779f58610"} Jan 28 19:56:50 crc kubenswrapper[4749]: I0128 19:56:50.760816 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" podStartSLOduration=3.025617233 podStartE2EDuration="10.760798724s" podCreationTimestamp="2026-01-28 19:56:40 +0000 UTC" firstStartedPulling="2026-01-28 19:56:41.784651623 +0000 UTC m=+4869.796178398" lastFinishedPulling="2026-01-28 19:56:49.519833114 +0000 UTC m=+4877.531359889" observedRunningTime="2026-01-28 19:56:50.75655378 +0000 UTC m=+4878.768080555" watchObservedRunningTime="2026-01-28 19:56:50.760798724 +0000 UTC m=+4878.772325499" Jan 28 19:56:55 crc kubenswrapper[4749]: E0128 19:56:55.005847 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.50:59016->38.102.83.50:34923: write tcp 38.102.83.50:59016->38.102.83.50:34923: write: connection reset by peer Jan 28 19:56:56 crc kubenswrapper[4749]: I0128 19:56:56.256764 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p5nz9/crc-debug-ldhnl"] Jan 28 19:56:56 crc kubenswrapper[4749]: I0128 19:56:56.259901 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:56:56 crc kubenswrapper[4749]: I0128 19:56:56.349011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmspx\" (UniqueName: \"kubernetes.io/projected/d9b21b7d-d676-4fc3-8010-57e4b33803b9-kube-api-access-jmspx\") pod \"crc-debug-ldhnl\" (UID: \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\") " pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:56:56 crc kubenswrapper[4749]: I0128 19:56:56.350265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9b21b7d-d676-4fc3-8010-57e4b33803b9-host\") pod \"crc-debug-ldhnl\" (UID: \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\") " pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:56:56 crc kubenswrapper[4749]: I0128 19:56:56.452879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmspx\" (UniqueName: \"kubernetes.io/projected/d9b21b7d-d676-4fc3-8010-57e4b33803b9-kube-api-access-jmspx\") pod \"crc-debug-ldhnl\" (UID: \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\") " pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:56:56 crc kubenswrapper[4749]: I0128 19:56:56.453417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9b21b7d-d676-4fc3-8010-57e4b33803b9-host\") pod \"crc-debug-ldhnl\" (UID: \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\") " pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:56:56 crc kubenswrapper[4749]: I0128 19:56:56.454370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9b21b7d-d676-4fc3-8010-57e4b33803b9-host\") pod \"crc-debug-ldhnl\" (UID: \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\") " pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:56:56 crc kubenswrapper[4749]: I0128 19:56:56.957253 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmspx\" (UniqueName: \"kubernetes.io/projected/d9b21b7d-d676-4fc3-8010-57e4b33803b9-kube-api-access-jmspx\") pod \"crc-debug-ldhnl\" (UID: \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\") " pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:56:57 crc kubenswrapper[4749]: I0128 19:56:57.183780 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:56:57 crc kubenswrapper[4749]: I0128 19:56:57.225754 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 19:56:57 crc kubenswrapper[4749]: I0128 19:56:57.807258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" event={"ID":"d9b21b7d-d676-4fc3-8010-57e4b33803b9","Type":"ContainerStarted","Data":"c65d309fee3ec21f3823bb840ca17f0ed2f113fe10bd0c4d2f9c27fc8b3f7763"} Jan 28 19:57:10 crc kubenswrapper[4749]: I0128 19:57:10.966395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" event={"ID":"d9b21b7d-d676-4fc3-8010-57e4b33803b9","Type":"ContainerStarted","Data":"6949859e06c30ce9ad6510a56c3d7a4cef58cac35b00ce47ff9065db179e27b7"} Jan 28 19:57:10 crc kubenswrapper[4749]: I0128 19:57:10.983417 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" podStartSLOduration=1.9835112910000001 podStartE2EDuration="14.983400955s" podCreationTimestamp="2026-01-28 19:56:56 +0000 UTC" firstStartedPulling="2026-01-28 19:56:57.225464262 +0000 UTC m=+4885.236991037" lastFinishedPulling="2026-01-28 19:57:10.225353926 +0000 UTC m=+4898.236880701" observedRunningTime="2026-01-28 19:57:10.978215787 +0000 UTC m=+4898.989742572" watchObservedRunningTime="2026-01-28 19:57:10.983400955 +0000 UTC m=+4898.994927730" Jan 28 19:57:31 crc kubenswrapper[4749]: I0128 19:57:31.178484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" event={"ID":"d9b21b7d-d676-4fc3-8010-57e4b33803b9","Type":"ContainerDied","Data":"6949859e06c30ce9ad6510a56c3d7a4cef58cac35b00ce47ff9065db179e27b7"} Jan 28 19:57:31 crc kubenswrapper[4749]: I0128 19:57:31.178452 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9b21b7d-d676-4fc3-8010-57e4b33803b9" containerID="6949859e06c30ce9ad6510a56c3d7a4cef58cac35b00ce47ff9065db179e27b7" exitCode=0 Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.351861 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.398186 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p5nz9/crc-debug-ldhnl"] Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.409785 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p5nz9/crc-debug-ldhnl"] Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.477456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmspx\" (UniqueName: \"kubernetes.io/projected/d9b21b7d-d676-4fc3-8010-57e4b33803b9-kube-api-access-jmspx\") pod \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\" (UID: \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\") " Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.477654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9b21b7d-d676-4fc3-8010-57e4b33803b9-host\") pod \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\" (UID: \"d9b21b7d-d676-4fc3-8010-57e4b33803b9\") " Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.477819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9b21b7d-d676-4fc3-8010-57e4b33803b9-host" (OuterVolumeSpecName: "host") pod "d9b21b7d-d676-4fc3-8010-57e4b33803b9" (UID: "d9b21b7d-d676-4fc3-8010-57e4b33803b9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.478749 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9b21b7d-d676-4fc3-8010-57e4b33803b9-host\") on node \"crc\" DevicePath \"\"" Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.489679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b21b7d-d676-4fc3-8010-57e4b33803b9-kube-api-access-jmspx" (OuterVolumeSpecName: "kube-api-access-jmspx") pod "d9b21b7d-d676-4fc3-8010-57e4b33803b9" (UID: "d9b21b7d-d676-4fc3-8010-57e4b33803b9"). InnerVolumeSpecName "kube-api-access-jmspx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.580922 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmspx\" (UniqueName: \"kubernetes.io/projected/d9b21b7d-d676-4fc3-8010-57e4b33803b9-kube-api-access-jmspx\") on node \"crc\" DevicePath \"\"" Jan 28 19:57:32 crc kubenswrapper[4749]: I0128 19:57:32.886342 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b21b7d-d676-4fc3-8010-57e4b33803b9" path="/var/lib/kubelet/pods/d9b21b7d-d676-4fc3-8010-57e4b33803b9/volumes" Jan 28 19:57:33 crc kubenswrapper[4749]: I0128 19:57:33.202514 4749 scope.go:117] "RemoveContainer" containerID="6949859e06c30ce9ad6510a56c3d7a4cef58cac35b00ce47ff9065db179e27b7" Jan 28 19:57:33 crc kubenswrapper[4749]: I0128 19:57:33.202580 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p5nz9/crc-debug-ldhnl" Jan 28 19:57:33 crc kubenswrapper[4749]: I0128 19:57:33.961153 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p5nz9/crc-debug-f7jjz"] Jan 28 19:57:33 crc kubenswrapper[4749]: E0128 19:57:33.961904 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b21b7d-d676-4fc3-8010-57e4b33803b9" containerName="container-00" Jan 28 19:57:33 crc kubenswrapper[4749]: I0128 19:57:33.961917 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b21b7d-d676-4fc3-8010-57e4b33803b9" containerName="container-00" Jan 28 19:57:33 crc kubenswrapper[4749]: I0128 19:57:33.962138 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b21b7d-d676-4fc3-8010-57e4b33803b9" containerName="container-00" Jan 28 19:57:33 crc kubenswrapper[4749]: I0128 19:57:33.962921 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:34 crc kubenswrapper[4749]: I0128 19:57:34.016172 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f4ae246-00a0-41b3-aba8-d367d12f9796-host\") pod \"crc-debug-f7jjz\" (UID: \"0f4ae246-00a0-41b3-aba8-d367d12f9796\") " pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:34 crc kubenswrapper[4749]: I0128 19:57:34.016305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z6ls\" (UniqueName: \"kubernetes.io/projected/0f4ae246-00a0-41b3-aba8-d367d12f9796-kube-api-access-5z6ls\") pod \"crc-debug-f7jjz\" (UID: \"0f4ae246-00a0-41b3-aba8-d367d12f9796\") " pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:34 crc kubenswrapper[4749]: I0128 19:57:34.118783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f4ae246-00a0-41b3-aba8-d367d12f9796-host\") pod \"crc-debug-f7jjz\" (UID: \"0f4ae246-00a0-41b3-aba8-d367d12f9796\") " pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:34 crc kubenswrapper[4749]: I0128 19:57:34.118863 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z6ls\" (UniqueName: \"kubernetes.io/projected/0f4ae246-00a0-41b3-aba8-d367d12f9796-kube-api-access-5z6ls\") pod \"crc-debug-f7jjz\" (UID: \"0f4ae246-00a0-41b3-aba8-d367d12f9796\") " pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:34 crc kubenswrapper[4749]: I0128 19:57:34.118998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f4ae246-00a0-41b3-aba8-d367d12f9796-host\") pod \"crc-debug-f7jjz\" (UID: \"0f4ae246-00a0-41b3-aba8-d367d12f9796\") " pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:34 crc kubenswrapper[4749]: I0128 19:57:34.141860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z6ls\" (UniqueName: \"kubernetes.io/projected/0f4ae246-00a0-41b3-aba8-d367d12f9796-kube-api-access-5z6ls\") pod \"crc-debug-f7jjz\" (UID: \"0f4ae246-00a0-41b3-aba8-d367d12f9796\") " pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:34 crc kubenswrapper[4749]: I0128 19:57:34.282003 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:35 crc kubenswrapper[4749]: I0128 19:57:35.225367 4749 generic.go:334] "Generic (PLEG): container finished" podID="0f4ae246-00a0-41b3-aba8-d367d12f9796" containerID="58ab41047098f79bec624dbb7e2f2ef3cce8e6feacc9580187c30b7e47f1cd96" exitCode=1 Jan 28 19:57:35 crc kubenswrapper[4749]: I0128 19:57:35.225458 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" event={"ID":"0f4ae246-00a0-41b3-aba8-d367d12f9796","Type":"ContainerDied","Data":"58ab41047098f79bec624dbb7e2f2ef3cce8e6feacc9580187c30b7e47f1cd96"} Jan 28 19:57:35 crc kubenswrapper[4749]: I0128 19:57:35.225716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" event={"ID":"0f4ae246-00a0-41b3-aba8-d367d12f9796","Type":"ContainerStarted","Data":"9332b7cfbfac1766ea0fdbaa9ad98cc9e857876c6896a6f68efc0ceb130506ed"} Jan 28 19:57:35 crc kubenswrapper[4749]: I0128 19:57:35.271819 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p5nz9/crc-debug-f7jjz"] Jan 28 19:57:35 crc kubenswrapper[4749]: I0128 19:57:35.282805 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p5nz9/crc-debug-f7jjz"] Jan 28 19:57:36 crc kubenswrapper[4749]: I0128 19:57:36.366470 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:36 crc kubenswrapper[4749]: I0128 19:57:36.478866 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f4ae246-00a0-41b3-aba8-d367d12f9796-host\") pod \"0f4ae246-00a0-41b3-aba8-d367d12f9796\" (UID: \"0f4ae246-00a0-41b3-aba8-d367d12f9796\") " Jan 28 19:57:36 crc kubenswrapper[4749]: I0128 19:57:36.479018 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f4ae246-00a0-41b3-aba8-d367d12f9796-host" (OuterVolumeSpecName: "host") pod "0f4ae246-00a0-41b3-aba8-d367d12f9796" (UID: "0f4ae246-00a0-41b3-aba8-d367d12f9796"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 19:57:36 crc kubenswrapper[4749]: I0128 19:57:36.479086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z6ls\" (UniqueName: \"kubernetes.io/projected/0f4ae246-00a0-41b3-aba8-d367d12f9796-kube-api-access-5z6ls\") pod \"0f4ae246-00a0-41b3-aba8-d367d12f9796\" (UID: \"0f4ae246-00a0-41b3-aba8-d367d12f9796\") " Jan 28 19:57:36 crc kubenswrapper[4749]: I0128 19:57:36.479913 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f4ae246-00a0-41b3-aba8-d367d12f9796-host\") on node \"crc\" DevicePath \"\"" Jan 28 19:57:36 crc kubenswrapper[4749]: I0128 19:57:36.489744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f4ae246-00a0-41b3-aba8-d367d12f9796-kube-api-access-5z6ls" (OuterVolumeSpecName: "kube-api-access-5z6ls") pod "0f4ae246-00a0-41b3-aba8-d367d12f9796" (UID: "0f4ae246-00a0-41b3-aba8-d367d12f9796"). InnerVolumeSpecName "kube-api-access-5z6ls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 19:57:36 crc kubenswrapper[4749]: I0128 19:57:36.582374 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z6ls\" (UniqueName: \"kubernetes.io/projected/0f4ae246-00a0-41b3-aba8-d367d12f9796-kube-api-access-5z6ls\") on node \"crc\" DevicePath \"\"" Jan 28 19:57:36 crc kubenswrapper[4749]: I0128 19:57:36.884542 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f4ae246-00a0-41b3-aba8-d367d12f9796" path="/var/lib/kubelet/pods/0f4ae246-00a0-41b3-aba8-d367d12f9796/volumes" Jan 28 19:57:37 crc kubenswrapper[4749]: I0128 19:57:37.244155 4749 scope.go:117] "RemoveContainer" containerID="58ab41047098f79bec624dbb7e2f2ef3cce8e6feacc9580187c30b7e47f1cd96" Jan 28 19:57:37 crc kubenswrapper[4749]: I0128 19:57:37.244206 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p5nz9/crc-debug-f7jjz" Jan 28 19:57:41 crc kubenswrapper[4749]: E0128 19:57:41.961268 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:57:42 crc kubenswrapper[4749]: E0128 19:57:42.127640 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:57:48 crc kubenswrapper[4749]: E0128 19:57:48.246417 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:57:48 crc kubenswrapper[4749]: E0128 19:57:48.246488 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:57:52 crc kubenswrapper[4749]: E0128 19:57:52.007764 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:57:57 crc kubenswrapper[4749]: E0128 19:57:57.396501 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:58:02 crc kubenswrapper[4749]: E0128 19:58:02.056834 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:58:12 crc kubenswrapper[4749]: E0128 19:58:12.256186 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:58:12 crc kubenswrapper[4749]: E0128 19:58:12.256188 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:58:22 crc kubenswrapper[4749]: E0128 19:58:22.563141 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:58:27 crc kubenswrapper[4749]: E0128 19:58:27.132897 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:58:27 crc kubenswrapper[4749]: I0128 19:58:27.466990 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:58:27 crc kubenswrapper[4749]: I0128 19:58:27.467059 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:58:32 crc kubenswrapper[4749]: I0128 19:58:32.622747 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_36a7915e-b865-4fbc-9e00-2a691100f162/aodh-api/0.log" Jan 28 19:58:32 crc kubenswrapper[4749]: I0128 19:58:32.798868 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_36a7915e-b865-4fbc-9e00-2a691100f162/aodh-listener/0.log" Jan 28 19:58:32 crc kubenswrapper[4749]: I0128 19:58:32.855815 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_36a7915e-b865-4fbc-9e00-2a691100f162/aodh-evaluator/0.log" Jan 28 19:58:32 crc kubenswrapper[4749]: I0128 19:58:32.860550 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_36a7915e-b865-4fbc-9e00-2a691100f162/aodh-notifier/0.log" Jan 28 19:58:32 crc kubenswrapper[4749]: E0128 19:58:32.946729 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b21b7d_d676_4fc3_8010_57e4b33803b9.slice\": RecentStats: unable to find data in memory cache]" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.043102 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5789584db6-28j8v_2cd564aa-37ee-4f52-b9c4-931550ef0aed/barbican-api/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.107725 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5789584db6-28j8v_2cd564aa-37ee-4f52-b9c4-931550ef0aed/barbican-api-log/0.log" Jan 28 19:58:33 crc 
kubenswrapper[4749]: I0128 19:58:33.154091 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-585f7f57cd-7qgls_591ffff8-803a-414b-b863-e3978dee85ce/barbican-keystone-listener/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.281522 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-585f7f57cd-7qgls_591ffff8-803a-414b-b863-e3978dee85ce/barbican-keystone-listener-log/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.301848 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7bb49d4f49-jkfrz_3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d/barbican-worker/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.359004 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7bb49d4f49-jkfrz_3f345fa0-96cc-4b6f-8e95-0f9d201aeb4d/barbican-worker-log/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.550010 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3212c9f3-5620-46b0-bece-ec7ea4b9763a/ceilometer-notification-agent/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.556723 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3212c9f3-5620-46b0-bece-ec7ea4b9763a/ceilometer-central-agent/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.588188 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3212c9f3-5620-46b0-bece-ec7ea4b9763a/proxy-httpd/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.742692 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3212c9f3-5620-46b0-bece-ec7ea4b9763a/sg-core/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.803392 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c145454f-dbd9-46ff-b588-6c86198dc2e6/cinder-api-log/0.log" Jan 28 19:58:33 crc kubenswrapper[4749]: I0128 19:58:33.847838 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c145454f-dbd9-46ff-b588-6c86198dc2e6/cinder-api/0.log" Jan 28 19:58:34 crc kubenswrapper[4749]: I0128 19:58:34.600444 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f31cc43a-6cff-4aec-8306-e6d0eb59f973/cinder-scheduler/0.log" Jan 28 19:58:34 crc kubenswrapper[4749]: I0128 19:58:34.601519 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f31cc43a-6cff-4aec-8306-e6d0eb59f973/probe/0.log" Jan 28 19:58:34 crc kubenswrapper[4749]: I0128 19:58:34.779538 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-hx6rs_5aeab955-536d-4525-adac-04c683c92aeb/init/0.log" Jan 28 19:58:34 crc kubenswrapper[4749]: I0128 19:58:34.966469 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-hx6rs_5aeab955-536d-4525-adac-04c683c92aeb/init/0.log" Jan 28 19:58:34 crc kubenswrapper[4749]: I0128 19:58:34.977086 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-hx6rs_5aeab955-536d-4525-adac-04c683c92aeb/dnsmasq-dns/0.log" Jan 28 19:58:35 crc kubenswrapper[4749]: I0128 19:58:35.034139 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a39b5ea6-cb49-4c58-85c4-f9b274ec979b/glance-httpd/0.log" Jan 28 19:58:35 crc 
kubenswrapper[4749]: I0128 19:58:35.158846 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a39b5ea6-cb49-4c58-85c4-f9b274ec979b/glance-log/0.log" Jan 28 19:58:35 crc kubenswrapper[4749]: I0128 19:58:35.482891 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_35c07eea-b699-4b39-b6db-ad0a9536ebe4/glance-log/0.log" Jan 28 19:58:35 crc kubenswrapper[4749]: I0128 19:58:35.585146 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_35c07eea-b699-4b39-b6db-ad0a9536ebe4/glance-httpd/0.log" Jan 28 19:58:36 crc kubenswrapper[4749]: I0128 19:58:36.038430 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-758976cb66-pthtk_8a13bcdb-3c7e-487b-90a5-5f794941eb5d/heat-api/0.log" Jan 28 19:58:36 crc kubenswrapper[4749]: I0128 19:58:36.071416 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5667478cf5-vcwtt_b7c300e5-3d03-481e-bd44-85d8f7d44e72/heat-engine/0.log" Jan 28 19:58:36 crc kubenswrapper[4749]: I0128 19:58:36.276615 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7b886dbf44-fw2gl_df71ee0d-ef89-4e80-807c-2751810bca99/heat-cfnapi/0.log" Jan 28 19:58:36 crc kubenswrapper[4749]: I0128 19:58:36.798201 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29493781-5zxzd_60b19a5a-17c1-4990-af7b-f3636f9e1cd2/keystone-cron/0.log" Jan 28 19:58:36 crc kubenswrapper[4749]: I0128 19:58:36.935117 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-54bd464d95-gdqhz_0ad4c5d7-7f28-4a51-9f19-c5cb429f8fae/keystone-api/0.log" Jan 28 19:58:37 crc kubenswrapper[4749]: I0128 19:58:37.029733 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_0229267a-385f-45ad-b285-2bb4be2c328d/kube-state-metrics/0.log" Jan 28 19:58:37 crc kubenswrapper[4749]: I0128 19:58:37.108122 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_5bd884b8-8914-47d0-b1f7-d85fc200ae9e/mysqld-exporter/0.log" Jan 28 19:58:37 crc kubenswrapper[4749]: I0128 19:58:37.366152 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68fb6b79f7-lxql7_5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7/neutron-api/0.log" Jan 28 19:58:37 crc kubenswrapper[4749]: I0128 19:58:37.390292 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68fb6b79f7-lxql7_5e1f6c01-89b9-4d8e-b8a5-57f18766ddd7/neutron-httpd/0.log" Jan 28 19:58:37 crc kubenswrapper[4749]: I0128 19:58:37.851563 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_31f23a26-7126-4b08-9616-49b0541bff3c/nova-api-log/0.log" Jan 28 19:58:37 crc kubenswrapper[4749]: I0128 19:58:37.855852 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_77653c2e-ff64-433d-aa7a-64d8dcab8eca/nova-cell0-conductor-conductor/0.log" Jan 28 19:58:38 crc kubenswrapper[4749]: I0128 19:58:38.068109 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_31f23a26-7126-4b08-9616-49b0541bff3c/nova-api-api/0.log" Jan 28 19:58:38 crc kubenswrapper[4749]: I0128 19:58:38.216545 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_81127c65-0c78-442f-83b3-1a060a1a0452/nova-cell1-conductor-conductor/0.log" Jan 28 19:58:38 crc kubenswrapper[4749]: I0128 
19:58:38.265998 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_95cae223-462c-417b-b24e-34fb9e61b186/nova-cell1-novncproxy-novncproxy/0.log" Jan 28 19:58:38 crc kubenswrapper[4749]: I0128 19:58:38.427213 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8f9fb94c-26de-427c-bd82-919875c80787/nova-metadata-log/0.log" Jan 28 19:58:38 crc kubenswrapper[4749]: I0128 19:58:38.726544 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e/mysql-bootstrap/0.log" Jan 28 19:58:38 crc kubenswrapper[4749]: I0128 19:58:38.734675 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_62d97098-ddeb-4da3-9c51-2b7eb9a5cd4e/nova-scheduler-scheduler/0.log" Jan 28 19:58:38 crc kubenswrapper[4749]: I0128 19:58:38.987965 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e/mysql-bootstrap/0.log" Jan 28 19:58:39 crc kubenswrapper[4749]: I0128 19:58:39.012618 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7fa32822-e9c9-4eb5-8a9b-d8f0ef76177e/galera/0.log" Jan 28 19:58:39 crc kubenswrapper[4749]: I0128 19:58:39.215051 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6c01d945-dccd-468e-b6ce-d269f1715462/mysql-bootstrap/0.log" Jan 28 19:58:39 crc kubenswrapper[4749]: I0128 19:58:39.383515 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6c01d945-dccd-468e-b6ce-d269f1715462/mysql-bootstrap/0.log" Jan 28 19:58:39 crc kubenswrapper[4749]: I0128 19:58:39.467818 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6c01d945-dccd-468e-b6ce-d269f1715462/galera/0.log" Jan 28 19:58:39 crc kubenswrapper[4749]: I0128 19:58:39.594148 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1b70372a-0994-4bb9-8369-7b00699ee7c0/openstackclient/0.log" Jan 28 19:58:39 crc kubenswrapper[4749]: I0128 19:58:39.754110 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-g8gw7_ca89079d-f8be-4e31-a78d-1c4257260a8f/ovn-controller/0.log" Jan 28 19:58:39 crc kubenswrapper[4749]: I0128 19:58:39.907950 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nlxrf_43a4085d-f42a-4a25-8a00-8cfc6c771821/openstack-network-exporter/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.054448 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7z7wf_94c11021-07ce-4db3-ae6d-19ff89737e77/ovsdb-server-init/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.293895 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7z7wf_94c11021-07ce-4db3-ae6d-19ff89737e77/ovsdb-server-init/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.304445 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7z7wf_94c11021-07ce-4db3-ae6d-19ff89737e77/ovsdb-server/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.314560 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7z7wf_94c11021-07ce-4db3-ae6d-19ff89737e77/ovs-vswitchd/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.336523 4749 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8f9fb94c-26de-427c-bd82-919875c80787/nova-metadata-metadata/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.516577 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1f41ddc7-edf9-4de5-9e85-3a7ec97de75f/ovn-northd/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.538207 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1f41ddc7-edf9-4de5-9e85-3a7ec97de75f/openstack-network-exporter/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.703542 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_886e17a6-463f-46b1-a745-5420d806f7e1/openstack-network-exporter/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.801322 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_886e17a6-463f-46b1-a745-5420d806f7e1/ovsdbserver-nb/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.927193 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5279af8d-ea25-418b-870f-71308c403b44/openstack-network-exporter/0.log" Jan 28 19:58:40 crc kubenswrapper[4749]: I0128 19:58:40.936967 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5279af8d-ea25-418b-870f-71308c403b44/ovsdbserver-sb/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.145870 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7699ff9496-twrqx_190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1/placement-api/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.194773 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7699ff9496-twrqx_190cd8ef-6d4f-4bc8-bdd6-4956a0d5d1f1/placement-log/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.293518 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3/init-config-reloader/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.485455 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3/config-reloader/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.495938 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3/init-config-reloader/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.561908 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3/prometheus/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.599860 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_56b79ef2-2b8d-486b-a757-a1a5b7dc8ee3/thanos-sidecar/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.735239 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5954ab85-e42a-498a-ae91-fd46445c0860/setup-container/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.903590 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5954ab85-e42a-498a-ae91-fd46445c0860/setup-container/0.log" Jan 28 19:58:41 crc kubenswrapper[4749]: I0128 19:58:41.986085 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5954ab85-e42a-498a-ae91-fd46445c0860/rabbitmq/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.036880 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_19a12543-c0a3-486f-b5bd-4f2862c15a37/setup-container/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.232673 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_19a12543-c0a3-486f-b5bd-4f2862c15a37/setup-container/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.277959 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_19a12543-c0a3-486f-b5bd-4f2862c15a37/rabbitmq/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.326248 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_94850daa-65af-4e6a-ad29-cfa28c3076e7/setup-container/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.536211 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_94850daa-65af-4e6a-ad29-cfa28c3076e7/setup-container/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.543421 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_94850daa-65af-4e6a-ad29-cfa28c3076e7/rabbitmq/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.571990 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_772600b2-9086-4d72-bb86-6edfb0a21b35/setup-container/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.854005 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_772600b2-9086-4d72-bb86-6edfb0a21b35/setup-container/0.log" Jan 28 19:58:42 crc kubenswrapper[4749]: I0128 19:58:42.878821 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_772600b2-9086-4d72-bb86-6edfb0a21b35/rabbitmq/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.071395 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-694b5876b5-phgqj_4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5/proxy-server/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.116613 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mhwkd_26e02b25-1356-40b3-b33a-947082d120e0/swift-ring-rebalance/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.162598 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-694b5876b5-phgqj_4dbc1f06-29ff-40cd-ab22-ac8df4bff0c5/proxy-httpd/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.376845 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/account-auditor/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.436855 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/account-replicator/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.474145 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/account-reaper/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.601619 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/container-auditor/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.615712 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/account-server/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.698715 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/container-server/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.730466 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/container-replicator/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.822978 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/container-updater/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.907049 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/object-auditor/0.log" Jan 28 19:58:43 crc kubenswrapper[4749]: I0128 19:58:43.948041 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/object-expirer/0.log" Jan 28 19:58:44 crc kubenswrapper[4749]: I0128 19:58:44.018451 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/object-replicator/0.log" Jan 28 19:58:44 crc kubenswrapper[4749]: I0128 19:58:44.083724 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/object-server/0.log" Jan 28 19:58:44 crc kubenswrapper[4749]: I0128 19:58:44.110609 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/object-updater/0.log" Jan 28 19:58:44 crc kubenswrapper[4749]: I0128 19:58:44.154780 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/rsync/0.log" Jan 28 19:58:44 crc kubenswrapper[4749]: I0128 19:58:44.263515 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1992d04-5d9f-498f-bee7-f2ab001feb76/swift-recon-cron/0.log" Jan 28 19:58:53 crc kubenswrapper[4749]: I0128 19:58:53.537502 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1c85419b-39a8-4c29-bff0-475dfc988a32/memcached/0.log" Jan 28 19:58:57 crc kubenswrapper[4749]: I0128 19:58:57.467292 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:58:57 crc kubenswrapper[4749]: I0128 19:58:57.467915 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:59:15 crc kubenswrapper[4749]: I0128 19:59:15.366978 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr_608c8370-fa8f-40c3-8fe2-0ee96449e671/util/0.log" Jan 28 19:59:15 crc kubenswrapper[4749]: I0128 19:59:15.578194 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr_608c8370-fa8f-40c3-8fe2-0ee96449e671/util/0.log" Jan 28 19:59:15 crc kubenswrapper[4749]: I0128 19:59:15.647376 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr_608c8370-fa8f-40c3-8fe2-0ee96449e671/pull/0.log" Jan 28 19:59:15 crc kubenswrapper[4749]: I0128 19:59:15.649581 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr_608c8370-fa8f-40c3-8fe2-0ee96449e671/pull/0.log" Jan 28 19:59:15 crc kubenswrapper[4749]: I0128 19:59:15.823778 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr_608c8370-fa8f-40c3-8fe2-0ee96449e671/util/0.log" Jan 28 19:59:15 crc kubenswrapper[4749]: I0128 19:59:15.959858 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr_608c8370-fa8f-40c3-8fe2-0ee96449e671/pull/0.log" Jan 28 19:59:15 crc kubenswrapper[4749]: I0128 19:59:15.971604 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ff8d4f1d1a2b9f947c76cb0859a1bcccc65b2094fb24a26aaef048927vfvzr_608c8370-fa8f-40c3-8fe2-0ee96449e671/extract/0.log" Jan 28 19:59:16 crc kubenswrapper[4749]: I0128 19:59:16.127779 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6bc7f4f4cf-7rcdb_b606ed73-e992-4755-ac52-4ace6b8b553c/manager/0.log" Jan 28 19:59:16 crc kubenswrapper[4749]: I0128 19:59:16.363661 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-f6487bd57-5ztsh_b553f796-28d7-455e-92bb-05bbc30a9a27/manager/0.log" Jan 28 19:59:16 crc kubenswrapper[4749]: I0128 19:59:16.421508 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66dfbd6f5d-n667g_b99425c5-bab7-48bb-a341-e1c160eac631/manager/0.log" Jan 28 19:59:16 crc kubenswrapper[4749]: I0128 19:59:16.705700 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-6db5dbd896-q7r57_8e193a26-a0a4-48e1-a3bf-13b52f809e3e/manager/0.log" Jan 28 19:59:16 crc kubenswrapper[4749]: I0128 19:59:16.721879 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-587c6bfdcf-zbk86_ce60ace7-ac44-45a5-9422-dade4b147417/manager/0.log" Jan 28 19:59:16 crc kubenswrapper[4749]: I0128 19:59:16.857519 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-g9cw9_aeea4bd2-a2a5-451b-9a40-02881d5901a0/manager/0.log" Jan 28 19:59:17 crc kubenswrapper[4749]: I0128 19:59:17.211138 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-958664b5-2x98l_9749144e-8c20-470f-97f1-c450d9520c07/manager/0.log" Jan 28 19:59:17 crc kubenswrapper[4749]: I0128 19:59:17.219411 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-4t7fn_76e8a2e5-b9cd-4114-af08-d767fb2ae45d/manager/0.log" Jan 28 19:59:17 crc kubenswrapper[4749]: I0128 19:59:17.317465 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6978b79747-mgpfr_d08f8b17-02c1-4000-a5a5-0e53b209472c/manager/0.log" Jan 28 19:59:17 crc kubenswrapper[4749]: I0128 19:59:17.470724 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-765668569f-p2dr2_b3ba0ea4-04ca-478e-ae2c-ebcbf3353362/manager/0.log" Jan 28 19:59:17 crc kubenswrapper[4749]: I0128 19:59:17.528643 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-npbds_ef2c3378-a1f3-4366-be22-66e12e71dcb2/manager/0.log" Jan 28 19:59:17 crc kubenswrapper[4749]: I0128 19:59:17.780940 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-694c5bfc85-cl2lx_4c6ce987-6e7a-4c86-9269-115e16d51c4d/manager/0.log" Jan 28 19:59:17 crc kubenswrapper[4749]: I0128 19:59:17.894024 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-ddcbfd695-kq24f_bbd77250-3bcc-4f6f-acc3-4e05f1b01c7b/manager/0.log" Jan 28 19:59:18 crc kubenswrapper[4749]: I0128 19:59:18.047301 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5c765b4558-hgs42_b52dfbde-2120-4054-974f-5992f89c9811/manager/0.log" Jan 28 19:59:18 crc kubenswrapper[4749]: I0128 19:59:18.109431 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4d6clds_c5b52880-267a-4211-a94d-09d132976cf3/manager/0.log" Jan 28 19:59:18 crc kubenswrapper[4749]: I0128 19:59:18.381526 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-858cbdb9cd-xhmkn_147f1b91-3404-40b5-9a8b-7799ab71fadf/operator/0.log" Jan 28 19:59:18 crc kubenswrapper[4749]: I0128 19:59:18.965877 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tjghg_0c39e330-6c54-46c8-a4f7-529871b845db/registry-server/0.log" Jan 28 19:59:19 crc kubenswrapper[4749]: I0128 19:59:19.178032 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-rbvsr_c102c3d5-9654-48bf-be4f-6cb41b1f8d7a/manager/0.log" Jan 28 19:59:19 crc kubenswrapper[4749]: I0128 19:59:19.369948 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-798d8549d8-9kb8p_5ec9e008-61f0-4635-8e4c-2ae6fe098fc3/manager/0.log" Jan 28 19:59:19 crc kubenswrapper[4749]: I0128 19:59:19.414689 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-pqzkn_a607ca6c-b867-42b6-b9ad-d5941671b685/manager/0.log" Jan 28 19:59:19 crc kubenswrapper[4749]: I0128 19:59:19.642880 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ssdhs_0f56d027-e393-4583-ab10-8dc8c1046027/operator/0.log" Jan 28 19:59:19 crc kubenswrapper[4749]: I0128 19:59:19.741184 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-5szzt_66c3a9b0-5edc-468c-bf0e-309cac8be928/manager/0.log" Jan 28 19:59:20 crc kubenswrapper[4749]: I0128 19:59:20.001345 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-x28dk_8780976f-c471-4ba4-956c-33ea9089437f/manager/0.log" Jan 28 19:59:20 crc kubenswrapper[4749]: I0128 19:59:20.061307 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-877d65859-6skzm_87eb2e29-ea6e-4b59-9b4b-3d5c09acda4c/manager/0.log" Jan 28 19:59:20 crc kubenswrapper[4749]: I0128 19:59:20.088150 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-767b8bc766-8kcjp_cd842919-01b8-4886-914e-2eb9e731f65f/manager/0.log" Jan 28 19:59:27 crc kubenswrapper[4749]: I0128 19:59:27.468194 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 19:59:27 crc kubenswrapper[4749]: I0128 19:59:27.468738 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 19:59:27 crc kubenswrapper[4749]: I0128 19:59:27.468787 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 19:59:27 crc kubenswrapper[4749]: I0128 19:59:27.469920 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 19:59:27 crc kubenswrapper[4749]: I0128 19:59:27.469972 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" gracePeriod=600 Jan 28 19:59:27 crc kubenswrapper[4749]: E0128 19:59:27.605131 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:59:28 crc kubenswrapper[4749]: I0128 19:59:28.437105 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" exitCode=0 Jan 28 19:59:28 crc kubenswrapper[4749]: I0128 19:59:28.437179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911"} Jan 28 19:59:28 crc kubenswrapper[4749]: I0128 19:59:28.437452 4749 scope.go:117] "RemoveContainer" containerID="235aa65ffdcac809260e67b0759c91f4859ee458a18b8ff92e354cd57d7184b7" Jan 28 19:59:28 crc kubenswrapper[4749]: I0128 19:59:28.438237 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 19:59:28 crc kubenswrapper[4749]: E0128 19:59:28.438558 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:59:39 crc kubenswrapper[4749]: I0128 19:59:39.872278 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 19:59:39 crc kubenswrapper[4749]: E0128 19:59:39.873917 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:59:43 crc kubenswrapper[4749]: I0128 19:59:43.804012 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nr5b7_2e43ff22-2928-48b1-b985-27926bcd5ef8/control-plane-machine-set-operator/0.log" Jan 28 19:59:44 crc kubenswrapper[4749]: I0128 19:59:44.030704 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7mg8d_ae92b3db-2f69-410c-9cb0-6383fe6343ba/kube-rbac-proxy/0.log" Jan 28 19:59:44 crc kubenswrapper[4749]: I0128 19:59:44.055540 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7mg8d_ae92b3db-2f69-410c-9cb0-6383fe6343ba/machine-api-operator/0.log" Jan 28 19:59:53 crc kubenswrapper[4749]: I0128 19:59:53.871696 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 19:59:53 crc kubenswrapper[4749]: E0128 19:59:53.872526 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 19:59:59 crc kubenswrapper[4749]: I0128 19:59:59.461161 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-h9mw4_ffb56991-41fd-4ca3-9aca-5577ff399534/cert-manager-controller/0.log" Jan 28 19:59:59 crc kubenswrapper[4749]: I0128 19:59:59.621012 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-rdfhh_3b3b2f85-7b19-4bc6-8618-885d114ed3d3/cert-manager-cainjector/0.log" Jan 28 19:59:59 crc kubenswrapper[4749]: I0128 19:59:59.675777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-mhc52_9cdb06ed-9c63-4f38-9276-42339904fdd0/cert-manager-webhook/0.log" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.146597 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w"] Jan 28 20:00:00 crc kubenswrapper[4749]: E0128 20:00:00.147525 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4ae246-00a0-41b3-aba8-d367d12f9796" containerName="container-00" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.147561 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4ae246-00a0-41b3-aba8-d367d12f9796" containerName="container-00" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.147800 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4ae246-00a0-41b3-aba8-d367d12f9796" containerName="container-00" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.148655 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.150490 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.155104 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.160122 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w"] Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.291983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-secret-volume\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.292037 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pcfs\" (UniqueName: \"kubernetes.io/projected/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-kube-api-access-2pcfs\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.292374 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-config-volume\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.394923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-config-volume\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.395151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-secret-volume\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.395182 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pcfs\" (UniqueName: \"kubernetes.io/projected/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-kube-api-access-2pcfs\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.395967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-config-volume\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.402154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-secret-volume\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.416699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pcfs\" (UniqueName: \"kubernetes.io/projected/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-kube-api-access-2pcfs\") pod \"collect-profiles-29493840-xwj5w\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:00 crc kubenswrapper[4749]: I0128 20:00:00.472564 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:01 crc kubenswrapper[4749]: I0128 20:00:01.083265 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w"] Jan 28 20:00:01 crc kubenswrapper[4749]: W0128 20:00:01.085567 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podada9afa5_4cbe_41f6_8f80_17e06f1caf36.slice/crio-d2d4dc8d53bcada8cb7320ed12225e3178b33f94f84133a1061ecafb751970a5 WatchSource:0}: Error finding container d2d4dc8d53bcada8cb7320ed12225e3178b33f94f84133a1061ecafb751970a5: Status 404 returned error can't find the container with id d2d4dc8d53bcada8cb7320ed12225e3178b33f94f84133a1061ecafb751970a5 Jan 28 20:00:01 crc kubenswrapper[4749]: I0128 20:00:01.252811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" event={"ID":"ada9afa5-4cbe-41f6-8f80-17e06f1caf36","Type":"ContainerStarted","Data":"d2d4dc8d53bcada8cb7320ed12225e3178b33f94f84133a1061ecafb751970a5"} Jan 28 20:00:02 crc kubenswrapper[4749]: I0128 20:00:02.264194 4749 generic.go:334] "Generic (PLEG): container finished" podID="ada9afa5-4cbe-41f6-8f80-17e06f1caf36" containerID="8250d72cb9f5875057d08d0b4525596114a45fabb67eae48322f96f379e41db1" exitCode=0 Jan 28 20:00:02 crc kubenswrapper[4749]: I0128 20:00:02.264268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" event={"ID":"ada9afa5-4cbe-41f6-8f80-17e06f1caf36","Type":"ContainerDied","Data":"8250d72cb9f5875057d08d0b4525596114a45fabb67eae48322f96f379e41db1"} Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.690020 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.787564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pcfs\" (UniqueName: \"kubernetes.io/projected/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-kube-api-access-2pcfs\") pod \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.787778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-config-volume\") pod \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.788466 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-config-volume" (OuterVolumeSpecName: "config-volume") pod "ada9afa5-4cbe-41f6-8f80-17e06f1caf36" (UID: "ada9afa5-4cbe-41f6-8f80-17e06f1caf36"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.788702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-secret-volume\") pod \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\" (UID: \"ada9afa5-4cbe-41f6-8f80-17e06f1caf36\") " Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.789518 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.794627 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ada9afa5-4cbe-41f6-8f80-17e06f1caf36" (UID: "ada9afa5-4cbe-41f6-8f80-17e06f1caf36"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.794740 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-kube-api-access-2pcfs" (OuterVolumeSpecName: "kube-api-access-2pcfs") pod "ada9afa5-4cbe-41f6-8f80-17e06f1caf36" (UID: "ada9afa5-4cbe-41f6-8f80-17e06f1caf36"). InnerVolumeSpecName "kube-api-access-2pcfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.891762 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pcfs\" (UniqueName: \"kubernetes.io/projected/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-kube-api-access-2pcfs\") on node \"crc\" DevicePath \"\"" Jan 28 20:00:03 crc kubenswrapper[4749]: I0128 20:00:03.891802 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ada9afa5-4cbe-41f6-8f80-17e06f1caf36-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 20:00:04 crc kubenswrapper[4749]: I0128 20:00:04.291784 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" Jan 28 20:00:04 crc kubenswrapper[4749]: I0128 20:00:04.292127 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493840-xwj5w" event={"ID":"ada9afa5-4cbe-41f6-8f80-17e06f1caf36","Type":"ContainerDied","Data":"d2d4dc8d53bcada8cb7320ed12225e3178b33f94f84133a1061ecafb751970a5"} Jan 28 20:00:04 crc kubenswrapper[4749]: I0128 20:00:04.292153 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2d4dc8d53bcada8cb7320ed12225e3178b33f94f84133a1061ecafb751970a5" Jan 28 20:00:04 crc kubenswrapper[4749]: I0128 20:00:04.779659 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs"] Jan 28 20:00:04 crc kubenswrapper[4749]: I0128 20:00:04.790232 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493795-4zcfs"] Jan 28 20:00:04 crc kubenswrapper[4749]: I0128 20:00:04.871683 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:00:04 crc kubenswrapper[4749]: E0128 20:00:04.872202 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:00:04 crc kubenswrapper[4749]: I0128 20:00:04.887033 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd0f6520-a360-40d0-a6aa-cd2775f09d94" path="/var/lib/kubelet/pods/cd0f6520-a360-40d0-a6aa-cd2775f09d94/volumes" Jan 28 20:00:14 crc kubenswrapper[4749]: I0128 20:00:14.325362 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-mbmx8_5c06f0b6-f78b-44cb-8305-c4ac809711e8/nmstate-console-plugin/0.log" Jan 28 20:00:14 crc kubenswrapper[4749]: I0128 20:00:14.483319 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zfv4r_61da7503-b316-46f6-8ea1-2ee05d142a2f/nmstate-handler/0.log" Jan 28 20:00:14 crc kubenswrapper[4749]: I0128 20:00:14.564824 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-cd7b4_89bc2633-4343-4bc9-9738-38c95e83dccd/kube-rbac-proxy/0.log" Jan 28 20:00:14 crc kubenswrapper[4749]: I0128 20:00:14.669920 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-cd7b4_89bc2633-4343-4bc9-9738-38c95e83dccd/nmstate-metrics/0.log" Jan 28 20:00:14 crc kubenswrapper[4749]: I0128 20:00:14.807433 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-kb6l2_ed1a865e-60b5-4988-aece-9badf4d94f37/nmstate-operator/0.log" Jan 28 20:00:14 crc kubenswrapper[4749]: I0128 20:00:14.891880 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-lvpx5_ed9849a0-0d35-4788-8987-a08e79096f9b/nmstate-webhook/0.log" Jan 28 20:00:15 crc kubenswrapper[4749]: I0128 20:00:15.872309 4749 scope.go:117] "RemoveContainer" 
containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:00:15 crc kubenswrapper[4749]: E0128 20:00:15.874241 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:00:29 crc kubenswrapper[4749]: I0128 20:00:29.663361 4749 scope.go:117] "RemoveContainer" containerID="a5cdfa449272e7041b744cab312a5929dd822a46450235edb04fa6426768b3ab" Jan 28 20:00:30 crc kubenswrapper[4749]: I0128 20:00:30.871872 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:00:30 crc kubenswrapper[4749]: E0128 20:00:30.872725 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:00:31 crc kubenswrapper[4749]: I0128 20:00:31.347540 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-76ff55d55d-mdgpw_5ee54d7a-a2c0-4238-be42-ba0843c776ef/manager/0.log" Jan 28 20:00:31 crc kubenswrapper[4749]: I0128 20:00:31.349073 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-76ff55d55d-mdgpw_5ee54d7a-a2c0-4238-be42-ba0843c776ef/kube-rbac-proxy/0.log" Jan 28 20:00:44 crc kubenswrapper[4749]: I0128 20:00:44.871628 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:00:44 crc kubenswrapper[4749]: E0128 20:00:44.872563 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:00:45 crc kubenswrapper[4749]: I0128 20:00:45.955905 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-hmgb7_2f8457f9-3010-4acc-88d1-97e5bec85c2c/prometheus-operator/0.log" Jan 28 20:00:46 crc kubenswrapper[4749]: I0128 20:00:46.220232 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_4f02f0f9-aa18-4986-af31-25b776f67fb7/prometheus-operator-admission-webhook/0.log" Jan 28 20:00:46 crc kubenswrapper[4749]: I0128 20:00:46.260629 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_84cf86f4-9829-41b0-8151-028ef75f861e/prometheus-operator-admission-webhook/0.log" Jan 28 20:00:46 crc kubenswrapper[4749]: I0128 20:00:46.382417 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-4kwb4_502b92eb-ac87-456c-933a-7e9ff562e326/operator/0.log" Jan 28 20:00:46 crc kubenswrapper[4749]: I0128 20:00:46.473683 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-9lj2z_5d090067-e17c-4708-8d06-23dd0e9da1dc/observability-ui-dashboards/0.log" Jan 28 20:00:46 crc kubenswrapper[4749]: I0128 20:00:46.593580 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-c6vlq_45c3f22c-a523-4e94-858c-97bdb2705b9e/perses-operator/0.log" Jan 28 20:00:56 crc kubenswrapper[4749]: I0128 20:00:56.872064 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:00:56 crc kubenswrapper[4749]: E0128 20:00:56.872977 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.156186 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29493841-sdblf"] Jan 28 20:01:00 crc kubenswrapper[4749]: E0128 20:01:00.157506 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9afa5-4cbe-41f6-8f80-17e06f1caf36" containerName="collect-profiles" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.157526 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada9afa5-4cbe-41f6-8f80-17e06f1caf36" containerName="collect-profiles" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.157804 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada9afa5-4cbe-41f6-8f80-17e06f1caf36" containerName="collect-profiles" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.158708 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.168198 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29493841-sdblf"] Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.336494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-fernet-keys\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.337007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-config-data\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.337101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt4j7\" (UniqueName: \"kubernetes.io/projected/eb73c82e-f8c3-4bae-be63-63cd103bee2b-kube-api-access-tt4j7\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.337265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-combined-ca-bundle\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.439740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt4j7\" (UniqueName: \"kubernetes.io/projected/eb73c82e-f8c3-4bae-be63-63cd103bee2b-kube-api-access-tt4j7\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.439835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-combined-ca-bundle\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.439911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-fernet-keys\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.440006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-config-data\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.758004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-combined-ca-bundle\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.760664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-fernet-keys\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.760771 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt4j7\" (UniqueName: \"kubernetes.io/projected/eb73c82e-f8c3-4bae-be63-63cd103bee2b-kube-api-access-tt4j7\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.764223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-config-data\") pod \"keystone-cron-29493841-sdblf\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:00 crc kubenswrapper[4749]: I0128 20:01:00.792149 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:01 crc kubenswrapper[4749]: I0128 20:01:01.293825 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29493841-sdblf"] Jan 28 20:01:01 crc kubenswrapper[4749]: I0128 20:01:01.870674 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493841-sdblf" event={"ID":"eb73c82e-f8c3-4bae-be63-63cd103bee2b","Type":"ContainerStarted","Data":"82590937e8fd8a5e42c93750f3b2d0c78201831e6ec718f68846ad1cc5327281"} Jan 28 20:01:01 crc kubenswrapper[4749]: I0128 20:01:01.871056 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493841-sdblf" event={"ID":"eb73c82e-f8c3-4bae-be63-63cd103bee2b","Type":"ContainerStarted","Data":"95e43c774728d46401166fe32c1d5523a3100dc4a597d50b0cd89af67e3c5680"} Jan 28 20:01:01 crc kubenswrapper[4749]: I0128 20:01:01.892558 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29493841-sdblf" podStartSLOduration=1.892536154 podStartE2EDuration="1.892536154s" podCreationTimestamp="2026-01-28 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:01:01.888347152 +0000 UTC m=+5129.899873927" watchObservedRunningTime="2026-01-28 20:01:01.892536154 +0000 UTC m=+5129.904062929" Jan 28 20:01:05 crc kubenswrapper[4749]: I0128 20:01:05.914778 4749 generic.go:334] "Generic (PLEG): container finished" podID="eb73c82e-f8c3-4bae-be63-63cd103bee2b" containerID="82590937e8fd8a5e42c93750f3b2d0c78201831e6ec718f68846ad1cc5327281" exitCode=0 Jan 28 20:01:05 crc kubenswrapper[4749]: I0128 20:01:05.914925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493841-sdblf" event={"ID":"eb73c82e-f8c3-4bae-be63-63cd103bee2b","Type":"ContainerDied","Data":"82590937e8fd8a5e42c93750f3b2d0c78201831e6ec718f68846ad1cc5327281"} Jan 28 20:01:06 crc kubenswrapper[4749]: 
I0128 20:01:06.001830 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-79cf69ddc8-hf99g_c15f1eca-3fcf-4c2c-b15a-99f1ee75b6b3/cluster-logging-operator/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.159521 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-6gh4z_52382cc0-233b-441a-b56f-24c303211d2a/collector/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.345153 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_39f97e52-cfdc-487c-b88a-a315c4c4d651/loki-compactor/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.369619 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-5f678c8dd6-qgkbc_eb71e057-ee31-4787-a3e4-d58815f9923e/loki-distributor/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.558541 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5b6db5567f-7qbgw_ca507b0e-375b-47b0-bd5e-c77f2bc7d521/opa/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.582975 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5b6db5567f-7qbgw_ca507b0e-375b-47b0-bd5e-c77f2bc7d521/gateway/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.698694 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5b6db5567f-9vjtq_cdcc709f-ccfa-4927-a1a3-333e7f810817/gateway/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.738376 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5b6db5567f-9vjtq_cdcc709f-ccfa-4927-a1a3-333e7f810817/opa/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.862269 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_cb1f1961-f7f5-4c47-93f9-7f06ac02b45c/loki-index-gateway/0.log" Jan 28 20:01:06 crc kubenswrapper[4749]: I0128 20:01:06.984763 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_74be48ce-557a-4f08-931e-5f222458fbe3/loki-ingester/0.log" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.102553 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-76788598db-qphbq_84872132-539a-41fc-9345-06b105caae61/loki-querier/0.log" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.230729 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-69d9546745-cfwmd_745b517a-16b6-4319-a44d-8976b8659a23/loki-query-frontend/0.log" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.335765 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.398465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-fernet-keys\") pod \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.398809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-combined-ca-bundle\") pod \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.398943 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt4j7\" (UniqueName: \"kubernetes.io/projected/eb73c82e-f8c3-4bae-be63-63cd103bee2b-kube-api-access-tt4j7\") pod \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.398972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-config-data\") pod \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\" (UID: \"eb73c82e-f8c3-4bae-be63-63cd103bee2b\") " Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.407953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "eb73c82e-f8c3-4bae-be63-63cd103bee2b" (UID: "eb73c82e-f8c3-4bae-be63-63cd103bee2b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.413696 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb73c82e-f8c3-4bae-be63-63cd103bee2b-kube-api-access-tt4j7" (OuterVolumeSpecName: "kube-api-access-tt4j7") pod "eb73c82e-f8c3-4bae-be63-63cd103bee2b" (UID: "eb73c82e-f8c3-4bae-be63-63cd103bee2b"). InnerVolumeSpecName "kube-api-access-tt4j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.431863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb73c82e-f8c3-4bae-be63-63cd103bee2b" (UID: "eb73c82e-f8c3-4bae-be63-63cd103bee2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.494624 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-config-data" (OuterVolumeSpecName: "config-data") pod "eb73c82e-f8c3-4bae-be63-63cd103bee2b" (UID: "eb73c82e-f8c3-4bae-be63-63cd103bee2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.501587 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt4j7\" (UniqueName: \"kubernetes.io/projected/eb73c82e-f8c3-4bae-be63-63cd103bee2b-kube-api-access-tt4j7\") on node \"crc\" DevicePath \"\"" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.501633 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.501646 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.501659 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c82e-f8c3-4bae-be63-63cd103bee2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.933200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493841-sdblf" event={"ID":"eb73c82e-f8c3-4bae-be63-63cd103bee2b","Type":"ContainerDied","Data":"95e43c774728d46401166fe32c1d5523a3100dc4a597d50b0cd89af67e3c5680"} Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.933238 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95e43c774728d46401166fe32c1d5523a3100dc4a597d50b0cd89af67e3c5680" Jan 28 20:01:07 crc kubenswrapper[4749]: I0128 20:01:07.933251 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29493841-sdblf" Jan 28 20:01:11 crc kubenswrapper[4749]: I0128 20:01:11.873674 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:01:11 crc kubenswrapper[4749]: E0128 20:01:11.874896 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:01:23 crc kubenswrapper[4749]: I0128 20:01:23.871645 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:01:23 crc kubenswrapper[4749]: E0128 20:01:23.872701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:01:25 crc kubenswrapper[4749]: I0128 20:01:25.452613 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-wd4x6_4623a06f-773f-4e09-aaf7-8dcbc11cadd4/kube-rbac-proxy/0.log" Jan 28 20:01:25 crc kubenswrapper[4749]: I0128 20:01:25.679467 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-6968d8fdc4-wd4x6_4623a06f-773f-4e09-aaf7-8dcbc11cadd4/controller/0.log" Jan 28 20:01:25 crc kubenswrapper[4749]: I0128 20:01:25.727392 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-frr-files/0.log" Jan 28 20:01:25 crc kubenswrapper[4749]: I0128 20:01:25.933974 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-reloader/0.log" Jan 28 20:01:25 crc kubenswrapper[4749]: I0128 20:01:25.971164 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-frr-files/0.log" Jan 28 20:01:25 crc kubenswrapper[4749]: I0128 20:01:25.971347 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-metrics/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.007587 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-reloader/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.142586 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-metrics/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.191160 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-frr-files/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.199960 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-reloader/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.211983 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-metrics/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.419391 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-frr-files/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.427577 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-metrics/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.469488 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/controller/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.477264 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/cp-reloader/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.613406 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/frr-metrics/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.684486 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/kube-rbac-proxy-frr/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.709076 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/kube-rbac-proxy/0.log" Jan 28 20:01:26 crc 
kubenswrapper[4749]: I0128 20:01:26.888137 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/reloader/0.log" Jan 28 20:01:26 crc kubenswrapper[4749]: I0128 20:01:26.987449 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-q6rrx_e8c5ee0f-766a-4f21-9243-1e926fd8ebb0/frr-k8s-webhook-server/0.log" Jan 28 20:01:27 crc kubenswrapper[4749]: I0128 20:01:27.181272 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75b778c54b-888q2_2368696d-a2cf-4a26-a7be-43981bf04f66/manager/0.log" Jan 28 20:01:27 crc kubenswrapper[4749]: I0128 20:01:27.345939 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79bbd885c9-dtmgc_5b4b0e51-a33b-4432-98be-be91474370c0/webhook-server/0.log" Jan 28 20:01:27 crc kubenswrapper[4749]: I0128 20:01:27.467677 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8szhl_f3e87fb6-163a-4024-b3c6-f20d528fd58f/kube-rbac-proxy/0.log" Jan 28 20:01:28 crc kubenswrapper[4749]: I0128 20:01:28.086973 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8szhl_f3e87fb6-163a-4024-b3c6-f20d528fd58f/speaker/0.log" Jan 28 20:01:28 crc kubenswrapper[4749]: I0128 20:01:28.132119 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5xs5q_d597ae59-efbc-48b1-9f21-51ead30e9812/frr/0.log" Jan 28 20:01:35 crc kubenswrapper[4749]: I0128 20:01:35.871597 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:01:35 crc kubenswrapper[4749]: E0128 20:01:35.872473 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:01:40 crc kubenswrapper[4749]: I0128 20:01:40.867367 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89_b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b/util/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.025777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89_b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b/util/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.066985 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89_b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b/pull/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.102775 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89_b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b/pull/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.225808 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89_b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b/util/0.log" Jan 28 
20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.241983 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89_b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b/extract/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.277563 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2cbh89_b553de9c-6ef6-48a5-bb5b-c6dd0f12f18b/pull/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.407502 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm_9df443ae-e4fd-4f9c-ba5f-99668ef3e451/util/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.635419 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm_9df443ae-e4fd-4f9c-ba5f-99668ef3e451/pull/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.635486 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm_9df443ae-e4fd-4f9c-ba5f-99668ef3e451/pull/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.711429 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm_9df443ae-e4fd-4f9c-ba5f-99668ef3e451/util/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.829898 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm_9df443ae-e4fd-4f9c-ba5f-99668ef3e451/util/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.867835 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm_9df443ae-e4fd-4f9c-ba5f-99668ef3e451/pull/0.log" Jan 28 20:01:41 crc kubenswrapper[4749]: I0128 20:01:41.940831 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcphnjm_9df443ae-e4fd-4f9c-ba5f-99668ef3e451/extract/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.019209 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl_a6755cda-1468-4e58-a4b1-9b93b269ec5f/util/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.206061 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl_a6755cda-1468-4e58-a4b1-9b93b269ec5f/util/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.296031 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl_a6755cda-1468-4e58-a4b1-9b93b269ec5f/pull/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.305088 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl_a6755cda-1468-4e58-a4b1-9b93b269ec5f/pull/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.435816 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl_a6755cda-1468-4e58-a4b1-9b93b269ec5f/util/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.457722 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl_a6755cda-1468-4e58-a4b1-9b93b269ec5f/pull/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.519815 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b5qhkl_a6755cda-1468-4e58-a4b1-9b93b269ec5f/extract/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.661197 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r_fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2/util/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.856751 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r_fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2/pull/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.867651 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r_fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2/util/0.log" Jan 28 20:01:42 crc kubenswrapper[4749]: I0128 20:01:42.907389 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r_fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2/pull/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.071165 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r_fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2/util/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.072091 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r_fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2/pull/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.088980 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713g7j5r_fc6d7f14-257e-41dd-9e34-cdfaf74aa9c2/extract/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.228957 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw_1f22f1a6-aca7-47f1-9cf4-1218d6dd3931/util/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.417092 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw_1f22f1a6-aca7-47f1-9cf4-1218d6dd3931/util/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.425212 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw_1f22f1a6-aca7-47f1-9cf4-1218d6dd3931/pull/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.452208 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw_1f22f1a6-aca7-47f1-9cf4-1218d6dd3931/pull/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.628835 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw_1f22f1a6-aca7-47f1-9cf4-1218d6dd3931/util/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.680754 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw_1f22f1a6-aca7-47f1-9cf4-1218d6dd3931/extract/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.694789 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hrrlw_1f22f1a6-aca7-47f1-9cf4-1218d6dd3931/pull/0.log" Jan 28 20:01:43 crc kubenswrapper[4749]: I0128 20:01:43.776455 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mq9lg_8e5a0c35-40e6-424a-9a03-377de71895bb/extract-utilities/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.036929 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mq9lg_8e5a0c35-40e6-424a-9a03-377de71895bb/extract-content/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.037585 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mq9lg_8e5a0c35-40e6-424a-9a03-377de71895bb/extract-content/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.038727 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mq9lg_8e5a0c35-40e6-424a-9a03-377de71895bb/extract-utilities/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.188350 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mq9lg_8e5a0c35-40e6-424a-9a03-377de71895bb/extract-utilities/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.209156 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mq9lg_8e5a0c35-40e6-424a-9a03-377de71895bb/extract-content/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.375459 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8b6q2_12e7736f-045a-424e-92f9-5ac197561ff0/extract-utilities/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.618558 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8b6q2_12e7736f-045a-424e-92f9-5ac197561ff0/extract-utilities/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.666038 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8b6q2_12e7736f-045a-424e-92f9-5ac197561ff0/extract-content/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.667074 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8b6q2_12e7736f-045a-424e-92f9-5ac197561ff0/extract-content/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.921947 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8b6q2_12e7736f-045a-424e-92f9-5ac197561ff0/extract-content/0.log" Jan 28 20:01:44 
crc kubenswrapper[4749]: I0128 20:01:44.923625 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8b6q2_12e7736f-045a-424e-92f9-5ac197561ff0/extract-utilities/0.log" Jan 28 20:01:44 crc kubenswrapper[4749]: I0128 20:01:44.971961 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mq9lg_8e5a0c35-40e6-424a-9a03-377de71895bb/registry-server/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.117700 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jzj9j_714ea987-0827-4494-9bee-eff5f2bb07b2/marketplace-operator/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.254397 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xlwht_118103e4-87d2-4431-bd0a-17fddfbbc497/extract-utilities/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.393201 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xlwht_118103e4-87d2-4431-bd0a-17fddfbbc497/extract-content/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.448675 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xlwht_118103e4-87d2-4431-bd0a-17fddfbbc497/extract-utilities/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.455457 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xlwht_118103e4-87d2-4431-bd0a-17fddfbbc497/extract-content/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.605725 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8b6q2_12e7736f-045a-424e-92f9-5ac197561ff0/registry-server/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.698145 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xlwht_118103e4-87d2-4431-bd0a-17fddfbbc497/extract-utilities/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.728799 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xlwht_118103e4-87d2-4431-bd0a-17fddfbbc497/extract-content/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.840433 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhkfd_b8bbe45a-affd-4b0f-81e1-dcf1db6b321a/extract-utilities/0.log" Jan 28 20:01:45 crc kubenswrapper[4749]: I0128 20:01:45.862818 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xlwht_118103e4-87d2-4431-bd0a-17fddfbbc497/registry-server/0.log" Jan 28 20:01:46 crc kubenswrapper[4749]: I0128 20:01:46.044707 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhkfd_b8bbe45a-affd-4b0f-81e1-dcf1db6b321a/extract-content/0.log" Jan 28 20:01:46 crc kubenswrapper[4749]: I0128 20:01:46.051006 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhkfd_b8bbe45a-affd-4b0f-81e1-dcf1db6b321a/extract-content/0.log" Jan 28 20:01:46 crc kubenswrapper[4749]: I0128 20:01:46.051312 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhkfd_b8bbe45a-affd-4b0f-81e1-dcf1db6b321a/extract-utilities/0.log" Jan 28 20:01:46 crc 
kubenswrapper[4749]: I0128 20:01:46.235006 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhkfd_b8bbe45a-affd-4b0f-81e1-dcf1db6b321a/extract-utilities/0.log" Jan 28 20:01:46 crc kubenswrapper[4749]: I0128 20:01:46.254960 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhkfd_b8bbe45a-affd-4b0f-81e1-dcf1db6b321a/extract-content/0.log" Jan 28 20:01:46 crc kubenswrapper[4749]: I0128 20:01:46.796026 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vhkfd_b8bbe45a-affd-4b0f-81e1-dcf1db6b321a/registry-server/0.log" Jan 28 20:01:50 crc kubenswrapper[4749]: I0128 20:01:50.873267 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:01:50 crc kubenswrapper[4749]: E0128 20:01:50.874846 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:01:59 crc kubenswrapper[4749]: I0128 20:01:59.773045 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d7b69c7d5-nlds4_84cf86f4-9829-41b0-8151-028ef75f861e/prometheus-operator-admission-webhook/0.log" Jan 28 20:01:59 crc kubenswrapper[4749]: I0128 20:01:59.786473 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-hmgb7_2f8457f9-3010-4acc-88d1-97e5bec85c2c/prometheus-operator/0.log" Jan 28 20:01:59 crc kubenswrapper[4749]: I0128 20:01:59.816636 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d7b69c7d5-kkg5b_4f02f0f9-aa18-4986-af31-25b776f67fb7/prometheus-operator-admission-webhook/0.log" Jan 28 20:01:59 crc kubenswrapper[4749]: I0128 20:01:59.983983 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-9lj2z_5d090067-e17c-4708-8d06-23dd0e9da1dc/observability-ui-dashboards/0.log" Jan 28 20:01:59 crc kubenswrapper[4749]: I0128 20:01:59.987772 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-4kwb4_502b92eb-ac87-456c-933a-7e9ff562e326/operator/0.log" Jan 28 20:02:00 crc kubenswrapper[4749]: I0128 20:02:00.019057 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-c6vlq_45c3f22c-a523-4e94-858c-97bdb2705b9e/perses-operator/0.log" Jan 28 20:02:03 crc kubenswrapper[4749]: I0128 20:02:03.871711 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:02:03 crc kubenswrapper[4749]: E0128 20:02:03.874012 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" 
podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:02:15 crc kubenswrapper[4749]: I0128 20:02:15.347975 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-76ff55d55d-mdgpw_5ee54d7a-a2c0-4238-be42-ba0843c776ef/kube-rbac-proxy/0.log" Jan 28 20:02:15 crc kubenswrapper[4749]: I0128 20:02:15.457754 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-76ff55d55d-mdgpw_5ee54d7a-a2c0-4238-be42-ba0843c776ef/manager/0.log" Jan 28 20:02:16 crc kubenswrapper[4749]: I0128 20:02:16.872067 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:02:16 crc kubenswrapper[4749]: E0128 20:02:16.872942 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.627251 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7nztq"] Jan 28 20:02:24 crc kubenswrapper[4749]: E0128 20:02:24.642727 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb73c82e-f8c3-4bae-be63-63cd103bee2b" containerName="keystone-cron" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.642748 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb73c82e-f8c3-4bae-be63-63cd103bee2b" containerName="keystone-cron" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.643067 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb73c82e-f8c3-4bae-be63-63cd103bee2b" containerName="keystone-cron" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.645232 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.645872 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7nztq"] Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.831655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-catalog-content\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.831743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-utilities\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.831903 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m975\" (UniqueName: \"kubernetes.io/projected/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-kube-api-access-2m975\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.934498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-catalog-content\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.934572 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-utilities\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.934623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m975\" (UniqueName: \"kubernetes.io/projected/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-kube-api-access-2m975\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.935153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-catalog-content\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.935183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-utilities\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.964384 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2m975\" (UniqueName: \"kubernetes.io/projected/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-kube-api-access-2m975\") pod \"community-operators-7nztq\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:24 crc kubenswrapper[4749]: I0128 20:02:24.980301 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:25 crc kubenswrapper[4749]: I0128 20:02:25.627161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7nztq"] Jan 28 20:02:25 crc kubenswrapper[4749]: I0128 20:02:25.688915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nztq" event={"ID":"bddaa014-944f-40bf-b2f7-5c8dfae4f2db","Type":"ContainerStarted","Data":"f7e9652fb0bc4e277236d4d58d5c88e08f460b599c60363f20937d68d34d2835"} Jan 28 20:02:26 crc kubenswrapper[4749]: I0128 20:02:26.707289 4749 generic.go:334] "Generic (PLEG): container finished" podID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerID="dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e" exitCode=0 Jan 28 20:02:26 crc kubenswrapper[4749]: I0128 20:02:26.707423 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nztq" event={"ID":"bddaa014-944f-40bf-b2f7-5c8dfae4f2db","Type":"ContainerDied","Data":"dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e"} Jan 28 20:02:26 crc kubenswrapper[4749]: I0128 20:02:26.709936 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 20:02:27 crc kubenswrapper[4749]: I0128 20:02:27.873137 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:02:27 crc kubenswrapper[4749]: E0128 20:02:27.873842 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:02:28 crc kubenswrapper[4749]: I0128 20:02:28.737512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nztq" event={"ID":"bddaa014-944f-40bf-b2f7-5c8dfae4f2db","Type":"ContainerStarted","Data":"c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28"} Jan 28 20:02:30 crc kubenswrapper[4749]: I0128 20:02:30.756570 4749 generic.go:334] "Generic (PLEG): container finished" podID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerID="c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28" exitCode=0 Jan 28 20:02:30 crc kubenswrapper[4749]: I0128 20:02:30.756653 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nztq" event={"ID":"bddaa014-944f-40bf-b2f7-5c8dfae4f2db","Type":"ContainerDied","Data":"c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28"} Jan 28 20:02:32 crc kubenswrapper[4749]: I0128 20:02:32.790934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nztq" 
event={"ID":"bddaa014-944f-40bf-b2f7-5c8dfae4f2db","Type":"ContainerStarted","Data":"4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286"} Jan 28 20:02:32 crc kubenswrapper[4749]: I0128 20:02:32.845349 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7nztq" podStartSLOduration=4.384458421 podStartE2EDuration="8.845311057s" podCreationTimestamp="2026-01-28 20:02:24 +0000 UTC" firstStartedPulling="2026-01-28 20:02:26.709538599 +0000 UTC m=+5214.721065374" lastFinishedPulling="2026-01-28 20:02:31.170391235 +0000 UTC m=+5219.181918010" observedRunningTime="2026-01-28 20:02:32.843736909 +0000 UTC m=+5220.855263684" watchObservedRunningTime="2026-01-28 20:02:32.845311057 +0000 UTC m=+5220.856837832" Jan 28 20:02:34 crc kubenswrapper[4749]: I0128 20:02:34.980745 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:34 crc kubenswrapper[4749]: I0128 20:02:34.981227 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:36 crc kubenswrapper[4749]: I0128 20:02:36.051462 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7nztq" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="registry-server" probeResult="failure" output=< Jan 28 20:02:36 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 20:02:36 crc kubenswrapper[4749]: > Jan 28 20:02:39 crc kubenswrapper[4749]: I0128 20:02:39.872371 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:02:39 crc kubenswrapper[4749]: E0128 20:02:39.873242 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:02:45 crc kubenswrapper[4749]: I0128 20:02:45.042999 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:45 crc kubenswrapper[4749]: I0128 20:02:45.126570 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:45 crc kubenswrapper[4749]: I0128 20:02:45.297416 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7nztq"] Jan 28 20:02:46 crc kubenswrapper[4749]: I0128 20:02:46.932574 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7nztq" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="registry-server" containerID="cri-o://4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286" gracePeriod=2 Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.608028 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.726127 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m975\" (UniqueName: \"kubernetes.io/projected/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-kube-api-access-2m975\") pod \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.726231 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-utilities\") pod \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.726612 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-catalog-content\") pod \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\" (UID: \"bddaa014-944f-40bf-b2f7-5c8dfae4f2db\") " Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.741873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-utilities" (OuterVolumeSpecName: "utilities") pod "bddaa014-944f-40bf-b2f7-5c8dfae4f2db" (UID: "bddaa014-944f-40bf-b2f7-5c8dfae4f2db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.743878 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-kube-api-access-2m975" (OuterVolumeSpecName: "kube-api-access-2m975") pod "bddaa014-944f-40bf-b2f7-5c8dfae4f2db" (UID: "bddaa014-944f-40bf-b2f7-5c8dfae4f2db"). InnerVolumeSpecName "kube-api-access-2m975". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.830204 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m975\" (UniqueName: \"kubernetes.io/projected/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-kube-api-access-2m975\") on node \"crc\" DevicePath \"\"" Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.830249 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.842200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bddaa014-944f-40bf-b2f7-5c8dfae4f2db" (UID: "bddaa014-944f-40bf-b2f7-5c8dfae4f2db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.932427 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bddaa014-944f-40bf-b2f7-5c8dfae4f2db-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.978696 4749 generic.go:334] "Generic (PLEG): container finished" podID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerID="4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286" exitCode=0 Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.978737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nztq" event={"ID":"bddaa014-944f-40bf-b2f7-5c8dfae4f2db","Type":"ContainerDied","Data":"4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286"} Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.978763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nztq" event={"ID":"bddaa014-944f-40bf-b2f7-5c8dfae4f2db","Type":"ContainerDied","Data":"f7e9652fb0bc4e277236d4d58d5c88e08f460b599c60363f20937d68d34d2835"} Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.978784 4749 scope.go:117] "RemoveContainer" containerID="4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286" Jan 28 20:02:47 crc kubenswrapper[4749]: I0128 20:02:47.978937 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nztq" Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.027519 4749 scope.go:117] "RemoveContainer" containerID="c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28" Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.036187 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7nztq"] Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.048963 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7nztq"] Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.075690 4749 scope.go:117] "RemoveContainer" containerID="dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e" Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.141754 4749 scope.go:117] "RemoveContainer" containerID="4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286" Jan 28 20:02:48 crc kubenswrapper[4749]: E0128 20:02:48.142841 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286\": container with ID starting with 4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286 not found: ID does not exist" containerID="4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286" Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.142892 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286"} err="failed to get container status \"4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286\": rpc error: code = NotFound desc = could not find container \"4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286\": container with ID starting with 4e79a61dc9ead825ff5823d008350385c97f44c2276399f7ac99fd67c57a2286 not found: ID does not exist" Jan 28 
20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.142922 4749 scope.go:117] "RemoveContainer" containerID="c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28" Jan 28 20:02:48 crc kubenswrapper[4749]: E0128 20:02:48.145188 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28\": container with ID starting with c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28 not found: ID does not exist" containerID="c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28" Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.145379 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28"} err="failed to get container status \"c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28\": rpc error: code = NotFound desc = could not find container \"c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28\": container with ID starting with c8b9f67f177d06370e9cf2c0f5d4299de7960a386d5150dbb33dd8409485ed28 not found: ID does not exist" Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.145469 4749 scope.go:117] "RemoveContainer" containerID="dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e" Jan 28 20:02:48 crc kubenswrapper[4749]: E0128 20:02:48.145993 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e\": container with ID starting with dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e not found: ID does not exist" containerID="dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e" Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.146058 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e"} err="failed to get container status \"dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e\": rpc error: code = NotFound desc = could not find container \"dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e\": container with ID starting with dcbf94a3d13bc89f4f530ea2935a5adb2eee91b754688ca8dcc395072eb6f39e not found: ID does not exist" Jan 28 20:02:48 crc kubenswrapper[4749]: I0128 20:02:48.885873 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" path="/var/lib/kubelet/pods/bddaa014-944f-40bf-b2f7-5c8dfae4f2db/volumes" Jan 28 20:02:52 crc kubenswrapper[4749]: I0128 20:02:52.880583 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:02:52 crc kubenswrapper[4749]: E0128 20:02:52.881382 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:03:04 crc kubenswrapper[4749]: I0128 20:03:04.877658 4749 scope.go:117] "RemoveContainer" 
containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:03:04 crc kubenswrapper[4749]: E0128 20:03:04.878593 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:03:15 crc kubenswrapper[4749]: I0128 20:03:15.871853 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:03:15 crc kubenswrapper[4749]: E0128 20:03:15.872910 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:03:29 crc kubenswrapper[4749]: I0128 20:03:29.872859 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:03:29 crc kubenswrapper[4749]: E0128 20:03:29.873758 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:03:44 crc kubenswrapper[4749]: I0128 20:03:44.872818 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:03:44 crc kubenswrapper[4749]: E0128 20:03:44.873614 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:03:59 crc kubenswrapper[4749]: I0128 20:03:59.871983 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:03:59 crc kubenswrapper[4749]: E0128 20:03:59.872723 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:04:04 crc kubenswrapper[4749]: I0128 20:04:04.747822 4749 generic.go:334] "Generic (PLEG): container finished" podID="6b53256e-31af-4b31-908a-e0341acbf58a" containerID="3cb8995d08f92a29ffd672daf9e937e1f0a1ffec79603aaa9c4d7df779f58610" exitCode=0 Jan 28 20:04:04 crc 
kubenswrapper[4749]: I0128 20:04:04.747911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" event={"ID":"6b53256e-31af-4b31-908a-e0341acbf58a","Type":"ContainerDied","Data":"3cb8995d08f92a29ffd672daf9e937e1f0a1ffec79603aaa9c4d7df779f58610"} Jan 28 20:04:04 crc kubenswrapper[4749]: I0128 20:04:04.749148 4749 scope.go:117] "RemoveContainer" containerID="3cb8995d08f92a29ffd672daf9e937e1f0a1ffec79603aaa9c4d7df779f58610" Jan 28 20:04:05 crc kubenswrapper[4749]: I0128 20:04:05.577188 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p5nz9_must-gather-rbf9q_6b53256e-31af-4b31-908a-e0341acbf58a/gather/0.log" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.005617 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x27qr"] Jan 28 20:04:06 crc kubenswrapper[4749]: E0128 20:04:06.006526 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="extract-content" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.006549 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="extract-content" Jan 28 20:04:06 crc kubenswrapper[4749]: E0128 20:04:06.006605 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="extract-utilities" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.006616 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="extract-utilities" Jan 28 20:04:06 crc kubenswrapper[4749]: E0128 20:04:06.006629 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="registry-server" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.006636 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="registry-server" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.006960 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bddaa014-944f-40bf-b2f7-5c8dfae4f2db" containerName="registry-server" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.008968 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.019817 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x27qr"] Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.101593 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9lsp\" (UniqueName: \"kubernetes.io/projected/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-kube-api-access-l9lsp\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.101688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-utilities\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.101708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-catalog-content\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.203603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9lsp\" (UniqueName: \"kubernetes.io/projected/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-kube-api-access-l9lsp\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.203920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-utilities\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.203997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-catalog-content\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.204556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-utilities\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.204593 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-catalog-content\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.222988 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l9lsp\" (UniqueName: \"kubernetes.io/projected/85c64fd7-ac30-465c-90b8-f4a2f5f06a63-kube-api-access-l9lsp\") pod \"certified-operators-x27qr\" (UID: \"85c64fd7-ac30-465c-90b8-f4a2f5f06a63\") " pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.333360 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:06 crc kubenswrapper[4749]: W0128 20:04:06.951722 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c64fd7_ac30_465c_90b8_f4a2f5f06a63.slice/crio-e8a6662e0423375454abd58c5efa389e0b7060d758438a86ec1468ad0c5171f8 WatchSource:0}: Error finding container e8a6662e0423375454abd58c5efa389e0b7060d758438a86ec1468ad0c5171f8: Status 404 returned error can't find the container with id e8a6662e0423375454abd58c5efa389e0b7060d758438a86ec1468ad0c5171f8 Jan 28 20:04:06 crc kubenswrapper[4749]: I0128 20:04:06.960800 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x27qr"] Jan 28 20:04:07 crc kubenswrapper[4749]: I0128 20:04:07.785155 4749 generic.go:334] "Generic (PLEG): container finished" podID="85c64fd7-ac30-465c-90b8-f4a2f5f06a63" containerID="e49a910c2f9b0d7f93ce37420513b50ab39fbc407b44cf19d0f4a3e7d9dbe3cb" exitCode=0 Jan 28 20:04:07 crc kubenswrapper[4749]: I0128 20:04:07.785210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x27qr" event={"ID":"85c64fd7-ac30-465c-90b8-f4a2f5f06a63","Type":"ContainerDied","Data":"e49a910c2f9b0d7f93ce37420513b50ab39fbc407b44cf19d0f4a3e7d9dbe3cb"} Jan 28 20:04:07 crc kubenswrapper[4749]: I0128 20:04:07.785486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x27qr" event={"ID":"85c64fd7-ac30-465c-90b8-f4a2f5f06a63","Type":"ContainerStarted","Data":"e8a6662e0423375454abd58c5efa389e0b7060d758438a86ec1468ad0c5171f8"} Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.021438 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qb4r"] Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.025554 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.058150 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qb4r"] Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.189563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-catalog-content\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.189786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtdr6\" (UniqueName: \"kubernetes.io/projected/d9b93b85-66de-40b0-998d-16e3146b4c66-kube-api-access-qtdr6\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.189907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-utilities\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.293460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-utilities\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.293723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-catalog-content\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.293784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtdr6\" (UniqueName: \"kubernetes.io/projected/d9b93b85-66de-40b0-998d-16e3146b4c66-kube-api-access-qtdr6\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.294351 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-utilities\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.294448 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-catalog-content\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.338913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qtdr6\" (UniqueName: \"kubernetes.io/projected/d9b93b85-66de-40b0-998d-16e3146b4c66-kube-api-access-qtdr6\") pod \"redhat-operators-7qb4r\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.348237 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:09 crc kubenswrapper[4749]: I0128 20:04:09.936665 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qb4r"] Jan 28 20:04:10 crc kubenswrapper[4749]: I0128 20:04:10.818239 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerID="25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109" exitCode=0 Jan 28 20:04:10 crc kubenswrapper[4749]: I0128 20:04:10.818303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qb4r" event={"ID":"d9b93b85-66de-40b0-998d-16e3146b4c66","Type":"ContainerDied","Data":"25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109"} Jan 28 20:04:10 crc kubenswrapper[4749]: I0128 20:04:10.818372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qb4r" event={"ID":"d9b93b85-66de-40b0-998d-16e3146b4c66","Type":"ContainerStarted","Data":"6dfab695599f7470d13dfff403d53c8728516f62dbfcbb79e594c0b80020d305"} Jan 28 20:04:12 crc kubenswrapper[4749]: I0128 20:04:12.909537 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:04:12 crc kubenswrapper[4749]: E0128 20:04:12.910353 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:04:12 crc kubenswrapper[4749]: I0128 20:04:12.988821 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p5nz9/must-gather-rbf9q"] Jan 28 20:04:12 crc kubenswrapper[4749]: I0128 20:04:12.989899 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" podUID="6b53256e-31af-4b31-908a-e0341acbf58a" containerName="copy" containerID="cri-o://86e914daf6c43a11a5468bc809932f11059f841ca66847c98fb71dbf67aec2e1" gracePeriod=2 Jan 28 20:04:13 crc kubenswrapper[4749]: I0128 20:04:13.002418 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p5nz9/must-gather-rbf9q"] Jan 28 20:04:13 crc kubenswrapper[4749]: I0128 20:04:13.854709 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p5nz9_must-gather-rbf9q_6b53256e-31af-4b31-908a-e0341acbf58a/copy/0.log" Jan 28 20:04:13 crc kubenswrapper[4749]: I0128 20:04:13.855400 4749 generic.go:334] "Generic (PLEG): container finished" podID="6b53256e-31af-4b31-908a-e0341acbf58a" containerID="86e914daf6c43a11a5468bc809932f11059f841ca66847c98fb71dbf67aec2e1" exitCode=143 Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.192881 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-p5nz9_must-gather-rbf9q_6b53256e-31af-4b31-908a-e0341acbf58a/copy/0.log" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.193614 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.330645 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m64zw\" (UniqueName: \"kubernetes.io/projected/6b53256e-31af-4b31-908a-e0341acbf58a-kube-api-access-m64zw\") pod \"6b53256e-31af-4b31-908a-e0341acbf58a\" (UID: \"6b53256e-31af-4b31-908a-e0341acbf58a\") " Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.331157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b53256e-31af-4b31-908a-e0341acbf58a-must-gather-output\") pod \"6b53256e-31af-4b31-908a-e0341acbf58a\" (UID: \"6b53256e-31af-4b31-908a-e0341acbf58a\") " Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.338860 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b53256e-31af-4b31-908a-e0341acbf58a-kube-api-access-m64zw" (OuterVolumeSpecName: "kube-api-access-m64zw") pod "6b53256e-31af-4b31-908a-e0341acbf58a" (UID: "6b53256e-31af-4b31-908a-e0341acbf58a"). InnerVolumeSpecName "kube-api-access-m64zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.434239 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m64zw\" (UniqueName: \"kubernetes.io/projected/6b53256e-31af-4b31-908a-e0341acbf58a-kube-api-access-m64zw\") on node \"crc\" DevicePath \"\"" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.503669 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b53256e-31af-4b31-908a-e0341acbf58a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6b53256e-31af-4b31-908a-e0341acbf58a" (UID: "6b53256e-31af-4b31-908a-e0341acbf58a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.537522 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b53256e-31af-4b31-908a-e0341acbf58a-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.867790 4749 generic.go:334] "Generic (PLEG): container finished" podID="85c64fd7-ac30-465c-90b8-f4a2f5f06a63" containerID="a22f0b422a5bdda7a4e845c2995531b4825fdafc46381701253482e02c842cf4" exitCode=0 Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.868133 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x27qr" event={"ID":"85c64fd7-ac30-465c-90b8-f4a2f5f06a63","Type":"ContainerDied","Data":"a22f0b422a5bdda7a4e845c2995531b4825fdafc46381701253482e02c842cf4"} Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.875288 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p5nz9_must-gather-rbf9q_6b53256e-31af-4b31-908a-e0341acbf58a/copy/0.log" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.875952 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p5nz9/must-gather-rbf9q" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.884124 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b53256e-31af-4b31-908a-e0341acbf58a" path="/var/lib/kubelet/pods/6b53256e-31af-4b31-908a-e0341acbf58a/volumes" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.885117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qb4r" event={"ID":"d9b93b85-66de-40b0-998d-16e3146b4c66","Type":"ContainerStarted","Data":"fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5"} Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.885167 4749 scope.go:117] "RemoveContainer" containerID="86e914daf6c43a11a5468bc809932f11059f841ca66847c98fb71dbf67aec2e1" Jan 28 20:04:14 crc kubenswrapper[4749]: I0128 20:04:14.914126 4749 scope.go:117] "RemoveContainer" containerID="3cb8995d08f92a29ffd672daf9e937e1f0a1ffec79603aaa9c4d7df779f58610" Jan 28 20:04:16 crc kubenswrapper[4749]: I0128 20:04:16.897052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x27qr" event={"ID":"85c64fd7-ac30-465c-90b8-f4a2f5f06a63","Type":"ContainerStarted","Data":"d6444fc0a418659521900853ca6ecc1f2cca66e3420b292eef7dde26bcb75b4b"} Jan 28 20:04:16 crc kubenswrapper[4749]: I0128 20:04:16.940266 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x27qr" podStartSLOduration=4.038772212 podStartE2EDuration="11.94024586s" podCreationTimestamp="2026-01-28 20:04:05 +0000 UTC" firstStartedPulling="2026-01-28 20:04:07.788909918 +0000 UTC m=+5315.800436693" lastFinishedPulling="2026-01-28 20:04:15.690383566 +0000 UTC m=+5323.701910341" observedRunningTime="2026-01-28 20:04:16.912783774 +0000 UTC m=+5324.924310569" watchObservedRunningTime="2026-01-28 20:04:16.94024586 +0000 UTC m=+5324.951772635" Jan 28 20:04:20 crc kubenswrapper[4749]: I0128 20:04:20.959518 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerID="fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5" exitCode=0 Jan 28 20:04:20 crc kubenswrapper[4749]: I0128 20:04:20.960117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qb4r" event={"ID":"d9b93b85-66de-40b0-998d-16e3146b4c66","Type":"ContainerDied","Data":"fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5"} Jan 28 20:04:22 crc kubenswrapper[4749]: I0128 20:04:22.982941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qb4r" event={"ID":"d9b93b85-66de-40b0-998d-16e3146b4c66","Type":"ContainerStarted","Data":"9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c"} Jan 28 20:04:23 crc kubenswrapper[4749]: I0128 20:04:23.021817 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qb4r" podStartSLOduration=4.292422893 podStartE2EDuration="15.02179507s" podCreationTimestamp="2026-01-28 20:04:08 +0000 UTC" firstStartedPulling="2026-01-28 20:04:10.821592201 +0000 UTC m=+5318.833118976" lastFinishedPulling="2026-01-28 20:04:21.550964378 +0000 UTC m=+5329.562491153" observedRunningTime="2026-01-28 20:04:23.012727979 +0000 UTC m=+5331.024254754" watchObservedRunningTime="2026-01-28 20:04:23.02179507 +0000 UTC m=+5331.033321845" Jan 28 20:04:23 crc kubenswrapper[4749]: I0128 20:04:23.872253 
4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:04:23 crc kubenswrapper[4749]: E0128 20:04:23.872778 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-698zt_openshift-machine-config-operator(1841c82d-7cd1-4c14-b54d-794bbb647776)\"" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" Jan 28 20:04:26 crc kubenswrapper[4749]: I0128 20:04:26.334245 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:26 crc kubenswrapper[4749]: I0128 20:04:26.334776 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:26 crc kubenswrapper[4749]: I0128 20:04:26.384192 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:27 crc kubenswrapper[4749]: I0128 20:04:27.086737 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x27qr" Jan 28 20:04:28 crc kubenswrapper[4749]: I0128 20:04:28.255187 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x27qr"] Jan 28 20:04:28 crc kubenswrapper[4749]: I0128 20:04:28.407594 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mq9lg"] Jan 28 20:04:28 crc kubenswrapper[4749]: I0128 20:04:28.408271 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mq9lg" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="registry-server" containerID="cri-o://8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b" gracePeriod=2 Jan 28 20:04:28 crc kubenswrapper[4749]: I0128 20:04:28.963651 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.074958 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerID="8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b" exitCode=0 Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.075099 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mq9lg" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.075105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9lg" event={"ID":"8e5a0c35-40e6-424a-9a03-377de71895bb","Type":"ContainerDied","Data":"8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b"} Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.075165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9lg" event={"ID":"8e5a0c35-40e6-424a-9a03-377de71895bb","Type":"ContainerDied","Data":"a4f9b43a08c7e16f68f6f435b0bb11976e2408c6897b123eb1145bd90e4b0735"} Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.075221 4749 scope.go:117] "RemoveContainer" containerID="8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.101952 4749 scope.go:117] "RemoveContainer" containerID="f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.126075 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxpqc\" (UniqueName: \"kubernetes.io/projected/8e5a0c35-40e6-424a-9a03-377de71895bb-kube-api-access-zxpqc\") pod \"8e5a0c35-40e6-424a-9a03-377de71895bb\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.126252 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-catalog-content\") pod \"8e5a0c35-40e6-424a-9a03-377de71895bb\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.126411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-utilities\") pod \"8e5a0c35-40e6-424a-9a03-377de71895bb\" (UID: \"8e5a0c35-40e6-424a-9a03-377de71895bb\") " Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.127148 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-utilities" (OuterVolumeSpecName: "utilities") pod "8e5a0c35-40e6-424a-9a03-377de71895bb" (UID: "8e5a0c35-40e6-424a-9a03-377de71895bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.129638 4749 scope.go:117] "RemoveContainer" containerID="76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.135201 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5a0c35-40e6-424a-9a03-377de71895bb-kube-api-access-zxpqc" (OuterVolumeSpecName: "kube-api-access-zxpqc") pod "8e5a0c35-40e6-424a-9a03-377de71895bb" (UID: "8e5a0c35-40e6-424a-9a03-377de71895bb"). InnerVolumeSpecName "kube-api-access-zxpqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.175349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e5a0c35-40e6-424a-9a03-377de71895bb" (UID: "8e5a0c35-40e6-424a-9a03-377de71895bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.229790 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.229841 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxpqc\" (UniqueName: \"kubernetes.io/projected/8e5a0c35-40e6-424a-9a03-377de71895bb-kube-api-access-zxpqc\") on node \"crc\" DevicePath \"\"" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.229853 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e5a0c35-40e6-424a-9a03-377de71895bb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.256206 4749 scope.go:117] "RemoveContainer" containerID="8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b" Jan 28 20:04:29 crc kubenswrapper[4749]: E0128 20:04:29.256709 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b\": container with ID starting with 8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b not found: ID does not exist" containerID="8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.256745 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b"} err="failed to get container status \"8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b\": rpc error: code = NotFound desc = could not find container \"8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b\": container with ID starting with 8e8fb982e3d938e18aa7ae7ec1f11574d4dcd8486d366f3a48523a711217d00b not found: ID does not exist" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.256770 4749 scope.go:117] "RemoveContainer" containerID="f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827" Jan 28 20:04:29 crc kubenswrapper[4749]: E0128 20:04:29.257066 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827\": container with ID starting with f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827 not found: ID does not exist" containerID="f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.257093 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827"} err="failed to get container status \"f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827\": rpc error: code = 
NotFound desc = could not find container \"f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827\": container with ID starting with f79ca334581035f2c80a95832259fdbfa4f3511531d8d9fc7043d709a8095827 not found: ID does not exist" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.257113 4749 scope.go:117] "RemoveContainer" containerID="76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02" Jan 28 20:04:29 crc kubenswrapper[4749]: E0128 20:04:29.257294 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02\": container with ID starting with 76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02 not found: ID does not exist" containerID="76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.257315 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02"} err="failed to get container status \"76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02\": rpc error: code = NotFound desc = could not find container \"76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02\": container with ID starting with 76bb43fd4adbfc5cbf9fa62eadb7a1e129a6442fa2f3bfc35843b70d83266f02 not found: ID does not exist" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.349418 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.349459 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.410744 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mq9lg"] Jan 28 20:04:29 crc kubenswrapper[4749]: I0128 20:04:29.421699 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mq9lg"] Jan 28 20:04:30 crc kubenswrapper[4749]: I0128 20:04:30.410416 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qb4r" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="registry-server" probeResult="failure" output=< Jan 28 20:04:30 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 20:04:30 crc kubenswrapper[4749]: > Jan 28 20:04:30 crc kubenswrapper[4749]: I0128 20:04:30.884384 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" path="/var/lib/kubelet/pods/8e5a0c35-40e6-424a-9a03-377de71895bb/volumes" Jan 28 20:04:36 crc kubenswrapper[4749]: I0128 20:04:36.872359 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911" Jan 28 20:04:37 crc kubenswrapper[4749]: I0128 20:04:37.155076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"a8bf5654419aae947fd0353e431b43b4ba3418ed4cd23109eca09d9d740730d4"} Jan 28 20:04:40 crc kubenswrapper[4749]: I0128 20:04:40.400749 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qb4r" 
podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="registry-server" probeResult="failure" output=< Jan 28 20:04:40 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 20:04:40 crc kubenswrapper[4749]: > Jan 28 20:04:50 crc kubenswrapper[4749]: I0128 20:04:50.397825 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qb4r" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="registry-server" probeResult="failure" output=< Jan 28 20:04:50 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Jan 28 20:04:50 crc kubenswrapper[4749]: > Jan 28 20:04:59 crc kubenswrapper[4749]: I0128 20:04:59.433073 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:04:59 crc kubenswrapper[4749]: I0128 20:04:59.501818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:05:00 crc kubenswrapper[4749]: I0128 20:05:00.825023 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qb4r"] Jan 28 20:05:01 crc kubenswrapper[4749]: I0128 20:05:01.386393 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qb4r" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="registry-server" containerID="cri-o://9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c" gracePeriod=2 Jan 28 20:05:01 crc kubenswrapper[4749]: I0128 20:05:01.910757 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.012678 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-utilities\") pod \"d9b93b85-66de-40b0-998d-16e3146b4c66\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.012829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtdr6\" (UniqueName: \"kubernetes.io/projected/d9b93b85-66de-40b0-998d-16e3146b4c66-kube-api-access-qtdr6\") pod \"d9b93b85-66de-40b0-998d-16e3146b4c66\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.012992 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-catalog-content\") pod \"d9b93b85-66de-40b0-998d-16e3146b4c66\" (UID: \"d9b93b85-66de-40b0-998d-16e3146b4c66\") " Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.013478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-utilities" (OuterVolumeSpecName: "utilities") pod "d9b93b85-66de-40b0-998d-16e3146b4c66" (UID: "d9b93b85-66de-40b0-998d-16e3146b4c66"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.014074 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.018709 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b93b85-66de-40b0-998d-16e3146b4c66-kube-api-access-qtdr6" (OuterVolumeSpecName: "kube-api-access-qtdr6") pod "d9b93b85-66de-40b0-998d-16e3146b4c66" (UID: "d9b93b85-66de-40b0-998d-16e3146b4c66"). InnerVolumeSpecName "kube-api-access-qtdr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.116220 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtdr6\" (UniqueName: \"kubernetes.io/projected/d9b93b85-66de-40b0-998d-16e3146b4c66-kube-api-access-qtdr6\") on node \"crc\" DevicePath \"\"" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.120307 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9b93b85-66de-40b0-998d-16e3146b4c66" (UID: "d9b93b85-66de-40b0-998d-16e3146b4c66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.219150 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b93b85-66de-40b0-998d-16e3146b4c66-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.399024 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerID="9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c" exitCode=0 Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.399068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qb4r" event={"ID":"d9b93b85-66de-40b0-998d-16e3146b4c66","Type":"ContainerDied","Data":"9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c"} Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.399095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qb4r" event={"ID":"d9b93b85-66de-40b0-998d-16e3146b4c66","Type":"ContainerDied","Data":"6dfab695599f7470d13dfff403d53c8728516f62dbfcbb79e594c0b80020d305"} Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.399112 4749 scope.go:117] "RemoveContainer" containerID="9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.399106 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qb4r" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.422884 4749 scope.go:117] "RemoveContainer" containerID="fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.438938 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qb4r"] Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.449889 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qb4r"] Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.456931 4749 scope.go:117] "RemoveContainer" containerID="25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.501459 4749 scope.go:117] "RemoveContainer" containerID="9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c" Jan 28 20:05:02 crc kubenswrapper[4749]: E0128 20:05:02.501964 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c\": container with ID starting with 9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c not found: ID does not exist" containerID="9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.502003 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c"} err="failed to get container status \"9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c\": rpc error: code = NotFound desc = could not find container \"9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c\": container with ID starting with 9f92fb030b33357e58be13ee944343736d682e1709291221de03b064132a9d4c not found: ID does not exist" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.502027 4749 scope.go:117] "RemoveContainer" containerID="fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5" Jan 28 20:05:02 crc kubenswrapper[4749]: E0128 20:05:02.502490 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5\": container with ID starting with fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5 not found: ID does not exist" containerID="fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.502567 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5"} err="failed to get container status \"fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5\": rpc error: code = NotFound desc = could not find container \"fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5\": container with ID starting with fdccf837154f4f3b0a8517e1f033339f7730f3655c94623067c903e9e748f4e5 not found: ID does not exist" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.502600 4749 scope.go:117] "RemoveContainer" containerID="25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109" Jan 28 20:05:02 crc kubenswrapper[4749]: E0128 20:05:02.502934 4749 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109\": container with ID starting with 25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109 not found: ID does not exist" containerID="25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.502960 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109"} err="failed to get container status \"25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109\": rpc error: code = NotFound desc = could not find container \"25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109\": container with ID starting with 25d2a103de800912d2f6027d0a16ab58bb88d8a4ed0e1c039957899ef6b29109 not found: ID does not exist" Jan 28 20:05:02 crc kubenswrapper[4749]: I0128 20:05:02.882511 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" path="/var/lib/kubelet/pods/d9b93b85-66de-40b0-998d-16e3146b4c66/volumes" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.519882 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zj9wp"] Jan 28 20:06:05 crc kubenswrapper[4749]: E0128 20:06:05.521072 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="registry-server" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521091 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="registry-server" Jan 28 20:06:05 crc kubenswrapper[4749]: E0128 20:06:05.521109 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b53256e-31af-4b31-908a-e0341acbf58a" containerName="gather" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521117 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b53256e-31af-4b31-908a-e0341acbf58a" containerName="gather" Jan 28 20:06:05 crc kubenswrapper[4749]: E0128 20:06:05.521133 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="extract-utilities" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521141 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="extract-utilities" Jan 28 20:06:05 crc kubenswrapper[4749]: E0128 20:06:05.521184 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="extract-content" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521193 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="extract-content" Jan 28 20:06:05 crc kubenswrapper[4749]: E0128 20:06:05.521202 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="extract-utilities" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521209 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="extract-utilities" Jan 28 20:06:05 crc kubenswrapper[4749]: E0128 20:06:05.521232 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="registry-server" Jan 28 20:06:05 crc 
kubenswrapper[4749]: I0128 20:06:05.521239 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="registry-server" Jan 28 20:06:05 crc kubenswrapper[4749]: E0128 20:06:05.521255 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="extract-content" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521263 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="extract-content" Jan 28 20:06:05 crc kubenswrapper[4749]: E0128 20:06:05.521278 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b53256e-31af-4b31-908a-e0341acbf58a" containerName="copy" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521285 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b53256e-31af-4b31-908a-e0341acbf58a" containerName="copy" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521862 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5a0c35-40e6-424a-9a03-377de71895bb" containerName="registry-server" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521895 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b53256e-31af-4b31-908a-e0341acbf58a" containerName="copy" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521908 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b93b85-66de-40b0-998d-16e3146b4c66" containerName="registry-server" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.521916 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b53256e-31af-4b31-908a-e0341acbf58a" containerName="gather" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.524030 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.551464 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zj9wp"] Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.673608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqtx7\" (UniqueName: \"kubernetes.io/projected/5a0df881-a48a-4727-8191-f37933d80b84-kube-api-access-tqtx7\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.673741 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-utilities\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.673791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-catalog-content\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.776506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-utilities\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.776567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-catalog-content\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.776733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqtx7\" (UniqueName: \"kubernetes.io/projected/5a0df881-a48a-4727-8191-f37933d80b84-kube-api-access-tqtx7\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.777027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-utilities\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.777263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-catalog-content\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.797341 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tqtx7\" (UniqueName: \"kubernetes.io/projected/5a0df881-a48a-4727-8191-f37933d80b84-kube-api-access-tqtx7\") pod \"redhat-marketplace-zj9wp\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:05 crc kubenswrapper[4749]: I0128 20:06:05.850838 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:06 crc kubenswrapper[4749]: I0128 20:06:06.399849 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zj9wp"] Jan 28 20:06:06 crc kubenswrapper[4749]: W0128 20:06:06.440985 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0df881_a48a_4727_8191_f37933d80b84.slice/crio-c8ba9cf7f533c25f9b5925924e2609eac05c92ea3e3f25773678048d62e02bc2 WatchSource:0}: Error finding container c8ba9cf7f533c25f9b5925924e2609eac05c92ea3e3f25773678048d62e02bc2: Status 404 returned error can't find the container with id c8ba9cf7f533c25f9b5925924e2609eac05c92ea3e3f25773678048d62e02bc2 Jan 28 20:06:07 crc kubenswrapper[4749]: I0128 20:06:07.065033 4749 generic.go:334] "Generic (PLEG): container finished" podID="5a0df881-a48a-4727-8191-f37933d80b84" containerID="15fe5c98577e36099f2a5a119799085467525cac72bfc49e368968c2c29ae2db" exitCode=0 Jan 28 20:06:07 crc kubenswrapper[4749]: I0128 20:06:07.065091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zj9wp" event={"ID":"5a0df881-a48a-4727-8191-f37933d80b84","Type":"ContainerDied","Data":"15fe5c98577e36099f2a5a119799085467525cac72bfc49e368968c2c29ae2db"} Jan 28 20:06:07 crc kubenswrapper[4749]: I0128 20:06:07.065343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zj9wp" event={"ID":"5a0df881-a48a-4727-8191-f37933d80b84","Type":"ContainerStarted","Data":"c8ba9cf7f533c25f9b5925924e2609eac05c92ea3e3f25773678048d62e02bc2"} Jan 28 20:06:09 crc kubenswrapper[4749]: I0128 20:06:09.085889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zj9wp" event={"ID":"5a0df881-a48a-4727-8191-f37933d80b84","Type":"ContainerStarted","Data":"35a19a9a0d110265442aba39467dc4815a0f85971eb836d9ae6fe2f74d7125b8"} Jan 28 20:06:10 crc kubenswrapper[4749]: I0128 20:06:10.096689 4749 generic.go:334] "Generic (PLEG): container finished" podID="5a0df881-a48a-4727-8191-f37933d80b84" containerID="35a19a9a0d110265442aba39467dc4815a0f85971eb836d9ae6fe2f74d7125b8" exitCode=0 Jan 28 20:06:10 crc kubenswrapper[4749]: I0128 20:06:10.096764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zj9wp" event={"ID":"5a0df881-a48a-4727-8191-f37933d80b84","Type":"ContainerDied","Data":"35a19a9a0d110265442aba39467dc4815a0f85971eb836d9ae6fe2f74d7125b8"} Jan 28 20:06:12 crc kubenswrapper[4749]: I0128 20:06:12.115076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zj9wp" event={"ID":"5a0df881-a48a-4727-8191-f37933d80b84","Type":"ContainerStarted","Data":"a16afcee8bacaab502db1bf76028ac60b9797860b85738572c62252a0f6fa071"} Jan 28 20:06:12 crc kubenswrapper[4749]: I0128 20:06:12.141983 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zj9wp" podStartSLOduration=3.134174162 
podStartE2EDuration="7.141962577s" podCreationTimestamp="2026-01-28 20:06:05 +0000 UTC" firstStartedPulling="2026-01-28 20:06:07.067055375 +0000 UTC m=+5435.078582150" lastFinishedPulling="2026-01-28 20:06:11.07484379 +0000 UTC m=+5439.086370565" observedRunningTime="2026-01-28 20:06:12.135180642 +0000 UTC m=+5440.146707447" watchObservedRunningTime="2026-01-28 20:06:12.141962577 +0000 UTC m=+5440.153489352" Jan 28 20:06:15 crc kubenswrapper[4749]: I0128 20:06:15.851091 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:15 crc kubenswrapper[4749]: I0128 20:06:15.851833 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:15 crc kubenswrapper[4749]: I0128 20:06:15.902932 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:16 crc kubenswrapper[4749]: I0128 20:06:16.239501 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:16 crc kubenswrapper[4749]: I0128 20:06:16.292202 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zj9wp"] Jan 28 20:06:18 crc kubenswrapper[4749]: I0128 20:06:18.200077 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zj9wp" podUID="5a0df881-a48a-4727-8191-f37933d80b84" containerName="registry-server" containerID="cri-o://a16afcee8bacaab502db1bf76028ac60b9797860b85738572c62252a0f6fa071" gracePeriod=2 Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.213143 4749 generic.go:334] "Generic (PLEG): container finished" podID="5a0df881-a48a-4727-8191-f37933d80b84" containerID="a16afcee8bacaab502db1bf76028ac60b9797860b85738572c62252a0f6fa071" exitCode=0 Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.213232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zj9wp" event={"ID":"5a0df881-a48a-4727-8191-f37933d80b84","Type":"ContainerDied","Data":"a16afcee8bacaab502db1bf76028ac60b9797860b85738572c62252a0f6fa071"} Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.213710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zj9wp" event={"ID":"5a0df881-a48a-4727-8191-f37933d80b84","Type":"ContainerDied","Data":"c8ba9cf7f533c25f9b5925924e2609eac05c92ea3e3f25773678048d62e02bc2"} Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.213746 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8ba9cf7f533c25f9b5925924e2609eac05c92ea3e3f25773678048d62e02bc2" Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.316230 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.452749 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-catalog-content\") pod \"5a0df881-a48a-4727-8191-f37933d80b84\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.452938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqtx7\" (UniqueName: \"kubernetes.io/projected/5a0df881-a48a-4727-8191-f37933d80b84-kube-api-access-tqtx7\") pod \"5a0df881-a48a-4727-8191-f37933d80b84\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.453046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-utilities\") pod \"5a0df881-a48a-4727-8191-f37933d80b84\" (UID: \"5a0df881-a48a-4727-8191-f37933d80b84\") " Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.454675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-utilities" (OuterVolumeSpecName: "utilities") pod "5a0df881-a48a-4727-8191-f37933d80b84" (UID: "5a0df881-a48a-4727-8191-f37933d80b84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.462675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0df881-a48a-4727-8191-f37933d80b84-kube-api-access-tqtx7" (OuterVolumeSpecName: "kube-api-access-tqtx7") pod "5a0df881-a48a-4727-8191-f37933d80b84" (UID: "5a0df881-a48a-4727-8191-f37933d80b84"). InnerVolumeSpecName "kube-api-access-tqtx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.473566 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a0df881-a48a-4727-8191-f37933d80b84" (UID: "5a0df881-a48a-4727-8191-f37933d80b84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.555899 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.555935 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0df881-a48a-4727-8191-f37933d80b84-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:06:19 crc kubenswrapper[4749]: I0128 20:06:19.555951 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqtx7\" (UniqueName: \"kubernetes.io/projected/5a0df881-a48a-4727-8191-f37933d80b84-kube-api-access-tqtx7\") on node \"crc\" DevicePath \"\"" Jan 28 20:06:20 crc kubenswrapper[4749]: I0128 20:06:20.222359 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zj9wp" Jan 28 20:06:20 crc kubenswrapper[4749]: I0128 20:06:20.262664 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zj9wp"] Jan 28 20:06:20 crc kubenswrapper[4749]: I0128 20:06:20.274286 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zj9wp"] Jan 28 20:06:20 crc kubenswrapper[4749]: I0128 20:06:20.884250 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a0df881-a48a-4727-8191-f37933d80b84" path="/var/lib/kubelet/pods/5a0df881-a48a-4727-8191-f37933d80b84/volumes" Jan 28 20:06:57 crc kubenswrapper[4749]: I0128 20:06:57.466920 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:06:57 crc kubenswrapper[4749]: I0128 20:06:57.467542 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:07:27 crc kubenswrapper[4749]: I0128 20:07:27.468178 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:07:27 crc kubenswrapper[4749]: I0128 20:07:27.468820 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:07:57 crc kubenswrapper[4749]: I0128 20:07:57.467840 4749 patch_prober.go:28] interesting pod/machine-config-daemon-698zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:07:57 crc kubenswrapper[4749]: I0128 20:07:57.468363 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:07:57 crc kubenswrapper[4749]: I0128 20:07:57.468408 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-698zt" Jan 28 20:07:57 crc kubenswrapper[4749]: I0128 20:07:57.469223 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a8bf5654419aae947fd0353e431b43b4ba3418ed4cd23109eca09d9d740730d4"} pod="openshift-machine-config-operator/machine-config-daemon-698zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Jan 28 20:07:57 crc kubenswrapper[4749]: I0128 20:07:57.469273 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-698zt" podUID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerName="machine-config-daemon" containerID="cri-o://a8bf5654419aae947fd0353e431b43b4ba3418ed4cd23109eca09d9d740730d4" gracePeriod=600 Jan 28 20:07:58 crc kubenswrapper[4749]: I0128 20:07:58.221850 4749 generic.go:334] "Generic (PLEG): container finished" podID="1841c82d-7cd1-4c14-b54d-794bbb647776" containerID="a8bf5654419aae947fd0353e431b43b4ba3418ed4cd23109eca09d9d740730d4" exitCode=0 Jan 28 20:07:58 crc kubenswrapper[4749]: I0128 20:07:58.221926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerDied","Data":"a8bf5654419aae947fd0353e431b43b4ba3418ed4cd23109eca09d9d740730d4"} Jan 28 20:07:58 crc kubenswrapper[4749]: I0128 20:07:58.222726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-698zt" event={"ID":"1841c82d-7cd1-4c14-b54d-794bbb647776","Type":"ContainerStarted","Data":"4e75d85859741f2fb92630dcfcfea54ab21c1915d31119709b20440ca7270f5d"} Jan 28 20:07:58 crc kubenswrapper[4749]: I0128 20:07:58.222802 4749 scope.go:117] "RemoveContainer" containerID="df2be45436073693e561d08c7046f476b76e6da3d3ef97b8e1e0870455a2f911"